• 8 Posts
  • 716 Comments
Joined 2 years ago
Cake day: October 4th, 2023

  • “AI’s natural limit is electricity, not chips,” Schmidt said, cutting through the industry’s semiconductor obsession with characteristic bluntness.

    I mean, maybe in the very long term that’s a fundamental limit, and you face things like Dyson spheres.

    But right now, I’m personally running one human-level AGI on roughly 100W of power, so I’m just gonna say that, as things stand, the binding limitation is software not being good enough. You’re, like, a software guy.

    Ultimately AI is an optimization problem, and if we don’t know how to solve the software problems fully yet, then, yeah, we can be inefficient and dump some of the heavy lifting on the hardware guys to get a small edge.

    But I’m pretty sure that the real breakthrough that needs to happen isn’t on the hardware side. Like, my existing PC and GPU are already more capable than my brain from a hardware standpoint. The hardware guys have already done their part and then some compared to human biology. It’s that we haven’t figured out the software to run on that hardware to make it do what we want.

    The military or whoever needs AI applications can ask for more hardware money to get an edge relative to competitors. But if you’re the (well, ex-) head of Google, you’re where a lot of the software and computer science guys who need to make the requisite software breakthroughs probably are, or could be. You’re probably the last person who should be saying “the hardware guys need to solve this”.

    It’s going to take more profound changes to what we’re doing in software today than just tweaking the parameters on some LLM, too. There’s probably some hard research work that has to be done. It’s not “we need immense resources dumped into manufacturing more datacenters, power plants, and chips”. It’s translating money into having some nerdy-looking humans bang away in some office somewhere and figure out the required changes to what needs to be done in software to get us there. Once that happens, then okay, sure, one needs hardware to make use of that software. But in July 2025, we don’t have the software to run on that hardware, not yet.

  • tal@lemmy.today to Comic Strips@lemmy.world · When life was full of wonder · edited · 7 days ago

    IIRC, they no longer print it, but you can probably buy used collections.

    kagis

    Yeah. The final print edition was 2010:

    https://en.wikipedia.org/wiki/Encyclopædia_Britannica

    The Encyclopædia Britannica (Latin for ‘British Encyclopaedia’) is a general-knowledge English-language encyclopaedia. It has been published by Encyclopædia Britannica, Inc. since 1768, although the company has changed ownership seven times. The 2010 version of the 15th edition, which spans 32 volumes[1] and 32,640 pages, was the last printed edition. Since 2016, it has been published exclusively as an online encyclopaedia at the website Britannica.com.

    Printed for 244 years, the Britannica was the longest-running in-print encyclopaedia in the English language. It was first published between 1768 and 1771 in Edinburgh, Scotland, in three volumes.

    Copyright (well, under US law, and I assume elsewhere) also mainly restricts distributing copies, not making them for your own use. If you want to print out a hard copy of the entire Encyclopedia Britannica website for your own use in the event of Armageddon, I imagine that there’s probably software that will let you do that.
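
    As a toy sketch of that idea, here’s a minimal single-page fetcher (my own illustration; real mirroring tools like wget or HTTrack handle link rewriting, images, and rate limiting far better):

```python
import urllib.request
from pathlib import Path

def save_page(url: str, dest: Path) -> int:
    """Fetch one URL and write the raw bytes to dest; return the byte count."""
    with urllib.request.urlopen(url) as resp:
        data = resp.read()
    dest.write_bytes(data)
    return len(data)
```

    Pointing something like this (or, far more practically, an off-the-shelf archiving tool) at each article URL would get you your doomsday hard copy, licensing questions aside.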


  • tal@lemmy.today to Comic Strips@lemmy.world · When life was full of wonder · edited · 7 days ago

    I mean, the bar to go get a reference book to look something up is significantly higher than “pull my smartphone out of my pocket and tap a few things in”.

    Here’s an article from 1945 on what the future of information access might look like.

    https://www.theatlantic.com/past/docs/unbound/flashbks/computer/bushf.htm

    The Atlantic Monthly | July 1945

    “As We May Think”

    by Vannevar Bush

    Eighty years later, the stuff that was science fiction to the people working on the cutting edge of technology looks pretty unremarkable, even absurdly conservative, to us in 2025:

    Like dry photography, microphotography still has a long way to go. The basic scheme of reducing the size of the record, and examining it by projection rather than directly, has possibilities too great to be ignored. The combination of optical projection and photographic reduction is already producing some results in microfilm for scholarly purposes, and the potentialities are highly suggestive. Today, with microfilm, reductions by a linear factor of 20 can be employed and still produce full clarity when the material is re-enlarged for examination. The limits are set by the graininess of the film, the excellence of the optical system, and the efficiency of the light sources employed. All of these are rapidly improving.

    Assume a linear ratio of 100 for future use. Consider film of the same thickness as paper, although thinner film will certainly be usable. Even under these conditions there would be a total factor of 10,000 between the bulk of the ordinary record on books, and its microfilm replica. The Encyclopoedia Britannica could be reduced to the volume of a matchbox. A library of a million volumes could be compressed into one end of a desk. If the human race has produced since the invention of movable type a total record, in the form of magazines, newspapers, books, tracts, advertising blurbs, correspondence, having a volume corresponding to a billion books, the whole affair, assembled and compressed, could be lugged off in a moving van. Mere compression, of course, is not enough; one needs not only to make and store a record but also be able to consult it, and this aspect of the matter comes later. Even the modern great library is not generally consulted; it is nibbled at by a few.

    Compression is important, however, when it comes to costs. The material for the microfilm Britannica would cost a nickel, and it could be mailed anywhere for a cent. What would it cost to print a million copies? To print a sheet of newspaper, in a large edition, costs a small fraction of a cent. The entire material of the Britannica in reduced microfilm form would go on a sheet eight and one-half by eleven inches. Once it is available, with the photographic reproduction methods of the future, duplicates in large quantities could probably be turned out for a cent apiece beyond the cost of materials.

    If the user wishes to consult a certain book, he taps its code on the keyboard, and the title page of the book promptly appears before him, projected onto one of his viewing positions. Frequently-used codes are mnemonic, so that he seldom consults his code book; but when he does, a single tap of a key projects it for his use. Moreover, he has supplemental levers. On deflecting one of these levers to the right he runs through the book before him, each page in turn being projected at a speed which just allows a recognizing glance at each. If he deflects it further to the right, he steps through the book 10 pages at a time; still further at 100 pages at a time. Deflection to the left gives him the same control backwards.

    A special button transfers him immediately to the first page of the index. Any given book of his library can thus be called up and consulted with far greater facility than if it were taken from a shelf. As he has several projection positions, he can leave one item in position while he calls up another. He can add marginal notes and comments, taking advantage of one possible type of dry photography, and it could even be arranged so that he can do this by a stylus scheme, such as is now employed in the telautograph seen in railroad waiting rooms, just as though he had the physical page before him.
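
    Bush’s arithmetic holds up, by the way: a linear reduction factor of 100 shrinks each page by 100× in both width and height, so with film as thick as the paper it replaces, bulk drops by the square of the linear factor. A quick sanity check (my own, not from the article):

```python
# Bush's 1945 microfilm arithmetic, checked directly.
linear = 100                     # linear reduction factor he assumes
volume_factor = linear ** 2      # area shrinks by linear^2; thickness unchanged
print(volume_factor)             # 10000 -- the "total factor of 10,000" in the text

# A library of a million volumes shrinks to the bulk of this many ordinary books:
print(1_000_000 // volume_factor)  # 100, i.e. "one end of a desk"
```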


  • “Fallout is the big one,” Middler claimed. “There are multiple Fallout projects in development, including, as far as I’m aware, that one that I’m sure you’re all wanting. It’s not far enough in along to say anything like ‘you’re going to be playing this game anytime soon’.”

    Middler then joked, “Anyway, New Vegas 2, coming soon”. Is this the one we’re “all wanting”? Yes, but then also so is Fallout 3 Remastered, Fallout 5 and even a remake of Fallout 2. The fanbase is rabid, and hungry, and it’s been a long time since they’ve been fulfilled outside of Fallout 76 updates.

    I mean, if Bethesda released all four of those, I’d buy all four.

    I also don’t know what “Fallout 3 Remastered” entails, but if it means forward-porting the content to Starfield’s engine, that’d be pretty cool, though I do wonder how much effort will be required for mod-porting.


  • As @[email protected] said.

    https://en.wikipedia.org/wiki/Active_noise_control

    Historically, if you were in a noisy environment, you could get closed-back, circumaural headphones — headphones that fit around your ears and had a lot of sound-absorption padding — to help soak up the sound. I still use decent non-ANC circumaural headphones at home.

    There are also some people, more willing to tolerate discomfort than I am, who use in-ear buds, which block noise in the ear canal, and then on top of that fit ear protectors intended for industrial use, like 3M Peltor X5 earmuffs, which have even more passive sound absorption than current circumaural headphones do, and are even larger.

    That sort of thing works well on higher frequency sound, but not as well on low-frequency stuff, like engine noise, large fans, stuff like that.

    ANC basically puts microphones on your headphones, picks up what sound is arriving at your ear, and then tries to compute and play back a signal that produces destructive interference at your ear. That is, if you look at the sound waves, where the environmental sound is low pressure, it plays back a high pressure signal, and where the environmental sound is high pressure, it plays back a low pressure signal. It’s not perfect; if it were, it would make environmental sound totally inaudible. But high-end ANC headphones are pretty impressive these days.

    I have a pair of Sennheiser Momentum 4 headphones (good, though not the best ANC out there in 2025, and I don’t personally recommend them for other reasons), and when they kick on, the headphones are designed to have the ANC fade in; the same thing happens in reverse, fading out when you flip the ANC off. It sounds almost as if fans and the like around you are powering up and down, very eerie if you’ve never experienced it before. Even the sounds that it doesn’t do so well on, like people talking, it significantly reduces in volume.
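
    The cancellation idea is easy to see in a toy simulation (my own illustration; real ANC has to predict the incoming sound through a model of the headphone’s acoustics and latency, rather than getting a perfect, zero-delay copy):

```python
import numpy as np

t = np.linspace(0, 1, 1000)               # one second of samples
noise = np.sin(2 * np.pi * 50 * t)        # 50 Hz hum, e.g. a fan or engine
anti_noise = -noise                       # ideal inverted playback signal
residual = noise + anti_noise             # pressure at the ear after cancellation
print(np.max(np.abs(residual)))           # 0.0: perfect cancellation, ideally

# With even 1 ms of timing error, cancellation degrades badly, which is part
# of why ANC works best on low, predictable frequencies:
late = -np.sin(2 * np.pi * 50 * (t - 0.001))
print(round(float(np.max(np.abs(noise + late))), 2))  # 0.31 of the original amplitude
```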

    And ANC does best on the other end of the spectrum, the end that passive sound absorption handles worst: the low-frequency stuff, especially regular sounds like fans. So having both a lot of passive sound absorption and ANC on a given pair of headphones lets the two complement each other.

    People often use cell phones in noisy environments, with a lot of people around, and ANC makes it a lot easier to hear music or whatever without background sound interfering. I think it’s very likely that people will, long term, mostly wind up using headphones with ANC (short of moving to something more elaborate like a direct brain interface). It’s not really all that important if you’re in a quiet environment, and I don’t bother using ANC headphones on my desktop at home. But if you’re in random environments (waiting in a grocery store line, in a restaurant with music playing over the restaurant’s speakers, on an airplane with the drone of the engines, whatever) it really helps to reduce that background sound.

    ANC isn’t that new; I remember it mostly being billed as useful for airplane engine noise back when, which it’s a good fit for. But it’s gotten considerably better over the years. For me, in 2025, good ANC is something that I really want to have for smartphone use.

    The problem is that in order to do ANC, you need at least a microphone (preferably an array), a model of the sound transmission through the headphones, and signal processing running on the input sound to generate that output sound. In theory, you could do all that on an attached computer if you had a fast data interface, but in practice, ANC-capable headphones are sold as self-contained units that handle it themselves. So you gotta power the little computer in the headphones. That means you probably have batteries, and at least for full-size headphones (rather than earbuds) you might as well stick a USB interface on them to charge them, even if the user is using Bluetooth for wireless connectivity. And once you’ve done that, it isn’t much more circuitry to let the headphones act as USB headphones, so in general, ANC headphones tend to also be USB-capable.

    My Momentum 4 headphones have all of Bluetooth, USB-C, and a traditional headphones interface, but…I just haven’t really wound up using the headphones interface when I have the other options available on a given device. Might be convenient if I were using some device that only had headphones output. shrugs


  • I mean, there were legitimate technical issues with the standard, especially on smartphones, which is where they really got pushed out. Most other devices do have headphones jacks. If I get a laptop, it’s probably got a headphones jack. Radios will have headphones jacks. Get a mixer, it’s got a headphones jack. I don’t think that the standard is going to vanish anytime soon in general.

    I like headphones jacks. I have a ton of 1/8" and 1/4" devices and headphones that I happily use. But they weren’t doing it for no reason.

    • From what I’ve read, the big one that drove them out on smartphones was that the jack just takes up a lot more physical space in the phone than USB-C or Bluetooth. I’d rather just have a thicker phone, but a lot of people wouldn’t, and if you’re going all over the phone trying to figure out what to eject to free up space, that’s gonna be a big target. For people who do want a jack on smartphones, which invariably have USB-C, you can get a similar effect by just leaving a small USB-C audio interface with a headphones jack on the end of your headphones (one with a passthrough USB-C port if you also want to be able to charge at the same time).

    • A second issue was that the standard didn’t have a way to provide power (there was a now-dead extension from many years back, IIRC for MD players, that let a small amount of power be provided over an extra ring). That didn’t matter for a long time, as long as your device could put out a strong enough signal to drive headphones of whatever impedance you had. But ANC has started to become popular, and ANC needs power. This is really the first time, I think, that there’s been a solid reason to want to power headphones over the connector.

    • The contacts get momentarily shorted when plugging in and unplugging, which could result in a loud pop at the driver membrane.

    • USB-C is designed so that the springy tensioning stuff that keeps the connection solid is on the (cheap, easy to replace) cord rather than the (expensive, hard to replace) device; I understand from past reading that this was a major reason that micro-USB replaced mini-USB. Instead of your device wearing out, the cord wears out. Not as much of an issue for headphones jacks as it was for mini-USB, but I think it’s fair to say that it’s desirable to have the tensioning on the cord side.

    • On USB-C, the right part breaks. One irritation I have with USB-C is that it is…kind of flimsy. Like, it doesn’t require that much force pushing on a plug sideways to damage a plug. However — and I don’t know if this was a design goal for USB-C, though I suspect it was — my experience has been that if that happens, it’s the plug on the (cheap, easy to replace) cord that gets damaged, not the device. I have a television with a headphones jack that I destroyed by tripping over a headphones cord once, because the headphones jack was nice and durable and let me tear components inside the television off. I’ve damaged several USB-C cables, but I’ve never damaged the device they’re connected to while doing so.

    On an interesting note, the standard is extremely old, probably one of the oldest electrical signal standards in general use today; the 1/4" mono jack dates back to phone switchboards in the 1800s.

    EDIT: Also, one other perk of using USB-C instead of a built-in headphones jack on a smartphone is that if the DAC in your phone sucks, going the USB-C-audio-interface route means that you can use a different DAC; you can’t really change the internal one. I don’t know about other people, but the last phone I had with an audio jack would let through a “wub wub wub” sound when I was charging it on USB off my car’s 12V cigarette lighter adapter (dirty power, but USB power is often really dirty). Was really obnoxious when feeding my car’s stereo via its AUX port. That’s avoidable for the manufacturer by putting some filtering on the DAC’s power supply, maybe a capacitor on the thing, but the phone manufacturer didn’t do it, maybe to save space or money. That’s not something that I can go fix. I eventually worked around it by getting a battery-powered Bluetooth receiver with a 1/8" headphones jack, cutting the phone’s DAC out of the equation. The phone’s internal DAC worked fine when the phone wasn’t charging, but I wanted the phone plugged in for (battery-hungry) navigation stuff when I was driving.
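
    For a sense of the kind of filtering involved (my own back-of-the-envelope illustration, with made-up component values, not anything about what’s actually inside any particular phone): a simple RC low-pass filter on a DAC’s supply rail attenuates ripple above its cutoff frequency, f_c = 1/(2πRC).

```python
# Hypothetical RC low-pass filter on a DAC supply rail; component values
# are illustrative, not taken from any real phone design.
import math

R = 10        # ohms, series resistance
C = 100e-6    # farads (100 uF capacitor to ground)

f_c = 1 / (2 * math.pi * R * C)
print(round(f_c, 1))  # 159.2 Hz: ripple well above this gets attenuated
```

    Charger switching noise sits far above a cutoff like that, so even cheap filtering knocks the “wub wub” way down; it just costs board space and a few components.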