An Update on Apple M1/M2 GPU Drivers

(lwn.net)

540 points | by MrBuddyCasino19 小时前

17 comments

  • whitehexagon9 小时前
    What I really respect is the dedication to completing support for the M1/M2. Too many projects get dropped the moment the next shiny ray-traced toy comes along.

    I know this type of work can be challenging to say the least. My own dabble with Zig and Pinephone hardware drivers reminded me of some of the pain of poorly documented hardware, but what a reward when it works.

    My own M1 was only purchased because of this project and Alyssa's efforts with OpenGL+ES. It only ever boots Asahi Linux. Thank-you very much for your efforts.

    • Cthulhu_7 小时前
      One thing I noticed in the M4 macbook announcement comments was how many people were happy with their M1 laptop, and second, how many people kept their Macbooks for nearly a decade; these devices are built to last, and I applaud long-term support from Apple itself and the Linux community.

      Second, since it's open source, Apple themselves are probably paying attention; I didn't read the whole thing because it's going over my head, but she discussed missing features in the chip that are being worked around.

      • attendant34464 小时前
        I have the opposite experience. Apple is incredibly difficult and expensive to repair. But I have been pleasantly surprised by the longevity and repairability of ThinkPads. I like those Apple M processors, but I know where I'm spending my money.
        • windowsrookie3 小时前
          Yes, MacBooks are generally more expensive to repair, but they also tend to not need repairs. It’s quite normal to hear from people who are using their 7+ year old MacBook without any issues and are still perfectly happy with it. I myself still use my 2018 MacBook Pro as my main device.

          When considering longevity, I will agree that Thinkpads are probably the only device that can compete with MacBooks. But there are trade-offs.

          MacBooks are going to be lighter, have better battery life, and have better displays. Not to mention macOS, which is my preferred OS.

          Thinkpads usually have great Linux support and swappable hardware for those who like to tinker with their devices. They also tend to be more durable, but this adds more weight.

          • MSFT_Edging1 小时前
            > MacBooks are going to be lighter

            Not going to let Macs have this one: my X1 Carbon is considerably lighter than an MBA.

            But generally agreeing. My last X1C lasted 8 years and still works perfectly, I just wanted an upgrade. My new one is even lighter than my old one. I opt for the FHD w/o touch screen and the second best processor to balance battery life and performance. Definitely not getting 18hrs battery life but 8-10 isn't something to laugh at.

            • windowsrookie22 分钟前
              I admit I was assuming they would be heavier, I didn't consider the X1 Carbon. When I think of Thinkpads I still picture the traditional IBM-style Thinkpad. A quick look at the specs shows the 14" X1 Carbon at 2.42lbs, 13" MacBook Air at 2.7 lbs, and a 14" Thinkpad E series at 3.17 lbs.
          • FuriouslyAdrift1 小时前
            MacBooks are some of the heaviest laptops on the market.

            The Air is the "light" one at 2.7 lbs for the 13" and 3.3 lbs for the 15".

            For reference, there are several 13" laptops on the market that are around 2 - 2.5 lbs, and 15"+ models that are less than 3 lbs.

            • windowsrookie21 分钟前
              Do any of those lighter laptops match the battery life and performance of the MacBook Air, while also being completely silent? I suppose I should have been more specific and stated I don't believe there are any laptops that can match the MacBook in all categories while being lighter.
          • majormajor1 小时前
            I haven't gotten a new Thinkpad since the 25th anniversary one but that was the last I bought while using a few as daily drivers for a decade since 2008.

            The ultimate issue was that the batteries on them degraded incredibly fast. I don't know if that's been fixed, but the ease of replacing (at least one of) the batteries was more than canceled out by the short life compared to a Mac.

          • Dah00n1 小时前
            >MacBooks are going to be lighter

            That sounds like an Apple sound bite, and it is wrong, compared to pretty much any MacBook competitor out there...

          • throwaway241241 小时前
            I still use a 2014 macbook air as my home server. It was my daily driver for 7 years. No complaints, still works perfectly. Haven't been careful with it either.
        • crossroadsguy39 分钟前
          Indeed. However once you need the repair it's so daunting. Now factor in non-developed nations (prices are usually the same or more there for both parts and service) and it's just insane. I had a 7-8 year old MacBook Air that I had bought for ~60K INR, and I had to replace its SSD. Just because it was to be done for an Apple device, even the outside repairperson charged ~16K (an Apple "authorised" service centre quoted 28K + taxes with a straight face). That outside repair was way too costly in late 2022 for a 128GB SSD. Same goes for their other devices.

          So what's to be done? Buy their insanely costly "limited" extended warranty for "2 more" years? And let's say we do that, then? I am sure there is an argument for that.

          I am typing this from a M1 MacBook Pro and if it dies I am really not sure whether I will even be able to get it repaired without "feeling" fleeced, or might as well move back to the normal laptop + Linux world and know that a laptop repair will never be a "minor bankruptcy" event ;-)

          No, "but Apple devices last long" doesn't cut it. So do non-Apple devices, yes they do. And if they need repair you don't at all fret w/ or w/o warranty.

          I am not sure how many here on HN will be able to connect with this but that hanging Damocles' sword is not a nice thing to have around when you use a costly Apple device.

          Making it easy and cheap/affordable for their devices to be repaired should not be an option left for OEMs.

        • javier21 小时前
          Yes, modern macs are stupidly hard to repair. But I am never using any Lenovo product again since the whole rootkit incident.
        • jrochkind11 小时前
          Agreed, hard to repair (which is a problem), but my experience is owning one for 8 years, and another for 4 years now (with hopes to keep it another 6), which never once needed a repair.
        • jhenkens3 小时前
          I agree on modern Macs being difficult to repair. I also will say that back a decade or two ago, it was likely you'd need to repair your computer after four years. Now, a four-year-old MacBook Air still feels brand new to me.
      • duxup4 小时前
        People talk about Apple prices being higher but the longevity of their devices really erases that price difference for me.

        I still have an old iPhone 8 that I test with that runs well, I’ve had numerous Android devices die in that timeframe, slow to a crawl, or at best their performance is erratic.

        • BirAdam3 小时前
          I am not so sure. I had a Pentium4 eMachine that made it all the way to around 2012. The same kind of life for my first Athlon64 machine from the same company. In both cases, the machines ran Linux, and they still work today. They were only retired because no meaningful software could run on them any longer. My PowerBook G4 got nowhere near that longevity, and owners of late-era Intel MacBooks are probably going to have similar issues.
          • bluGill3 小时前
            Those machines should have been retired long ago because modern machines use much less power (the Pentium4 was really bad). Of course there is a difference between a machine that you turn on for an hour once a week and a machine you leave on 24x7. The latter use would pay for a new computer in just a few years from savings on the electric bill alone (assuming the same workload; if you load the new computer with more things to do you won't see those savings).
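
            A rough back-of-the-envelope sketch of that claim (all numbers are illustrative assumptions, not measurements: ~150 W average draw for an old Pentium4 tower, ~15 W for a modern mini PC, $0.30/kWh):

              #include <stdio.h>

              int main(void) {
                  /* Illustrative assumptions, not measurements */
                  double old_watts  = 150.0;  /* aging Pentium 4 desktop, average draw */
                  double new_watts  = 15.0;   /* modern low-power mini PC */
                  double price_kwh  = 0.30;   /* USD per kWh */
                  double hours_year = 24.0 * 365.0;

                  double saved_kwh = (old_watts - new_watts) * hours_year / 1000.0;
                  double saved_usd = saved_kwh * price_kwh;

                  printf("Energy saved per year: %.0f kWh\n", saved_kwh); /* ~1183 kWh */
                  printf("Money saved per year:  $%.0f\n", saved_usd);    /* ~$355 */
                  return 0;
              }

            At those (assumed) rates the 24x7 case does pay for a cheap replacement within a few years; the once-a-week case clearly does not.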
            • bityard2 小时前
              The non-trivial environmental damage of building the new computer AND disposing of the old one should not be ignored. There are plenty of us who would rather pay a few extra bucks on our power bill than send e-waste to the landfill before it needs to be.
              • bluGill1 小时前
                You are instead sending CO2 into the atmosphere (depending on how you get your power, but for most of us burning things is how we get it; renewables are coming online fast, though).
          • duxup58 分钟前
            Longevity (and certainly not performance over time) is not something I've ever heard associated with an eMachine. I think that machine might be an outlier.
        • pjmlp1 小时前
          I still manage PCs that are about 15 years old, working happily for what their users care about.
        • reaperducer1 小时前
          > I still have an old iPhone 8 that I test with that runs well

          I have an iPhone 3G from 2008 that is currently playing music as I work.

          After 16 years, it still syncs perfectly with Apple Music on the current version of macOS.

          Unfortunately, since it can't handle modern encryption, I can't get it to connect to any wifi access point.

        • BolexNOLA3 小时前
          I also point people to the Mac mini line whenever they talk about price. The new M4 at ~$700 is a steal, but they have been affordable since the M1 refresh. Real “bang for your buck” computer IMO
          • opan2 小时前
            With soldered RAM and storage it seems quite risky to get the lowest spec version of any new Mac, so I don't see much point in discussing the entry-level price point. Do they still start at 8GB? I recall hearing that negatively impacted the SSD's performance significantly and people were recommending 16GB minimum for M1 stuff.
            • BolexNOLA2 小时前
              Base model is 16gb as of the M4 release.

              You also don’t have to get the base model. You can stay under $1000 while increasing to 24gb of ram.

            • cdelsolar2 小时前
              I have an 8GB M1 MacBook Pro and it’s the perfect computer. I always have lots of tabs and Docker open etc. It works great. I want to upgrade at some point but probably don’t need to anytime soon.
              • hylaride2 小时前
                8GB is fine (on macos anyways) for what you're doing, but like it or not more and more AI is being (unnecessarily) shoved into applications and it needs memory for the GPU.

                Memory (both system and GPU) is usually the best thing to future-proof a computer with at buy time, especially as it's not user-replaceable anymore.

                • BolexNOLA2 小时前
                  You can get a 512g/24gb for $1104.00 USD. Still seems pretty great to me. If you’re disciplined with your internal/external storage you could also stay at a 256gb SSD and get 32gb of ram.
        • exe344 小时前
          my mid 2012 MacBook air refuses to die.
      • TigerAndDragonB2 小时前
        > ...how many people kept their Macbooks for nearly a decade; these devices are built to last...

        This is no longer true for me. I've been an Apple fan since the Apple ][ days, and reluctantly left the ecosystem last year. The hardware walled garden with soldered-on components and components tied down to specific units for ostensible privacy and security reasons (I don't buy those reasons), combined with the steadily degrading OS polish and fine attention to detail, meant that, for me personally, I could no longer justify the cognitive load to continue with a Mac laptop as my daily driver. While others might point to a cost and/or value differential, I'm in the highly privileged position to be insensitive to those factors.

        The last straw was a board-soldered SSD that quit well before I was willing to upgrade, and even Louis Rossmann's shop said it would cost way more to desolder and solder a new one on than the entire laptop is worth. I bought a Framework the same day; when it arrived I restored my data files to it and have been running it as my daily driver ever since. The Mac laptop is still sitting here, as I keep hoping to find time to develop my wave soldering skills to try my hand at saving it from the landfill, or to break down and unsustainably pay for the repair (I do what I can to avoid perpetuating dark patterns, but it is a Sisyphean effort).

        I found myself in a position of having to think increasingly more about working around the Mac ecosystem instead of working invisibly within it (like a fish in water not having to think about water), to the point that it no longer made sense to stick with it. It has definitively lost the "It Just Works" polish that bound me so tightly to the ecosystem in the past. I see no functional difference in my daily work patterns using a Mac laptop versus a Framework running Fedora.

        To be sure, there are a lot of areas I have to work around on the Framework-Fedora daily driver, but for my personal work patterns and needs, I evaluated them to be roughly the same amount of time and cognitive load I spent on the Mac. Maybe Framework-Fedora is slightly worse, but close enough that I'd rather throw my hat into the more open ring than the increasingly closed walled garden Apple is definitely taking us toward, which does not align with my vision for our computing future. It does not hurt that experimenting with local LLMs and various DevOps tooling for my work's Linux-based infrastructure is way easier and more frictionless on Fedora for me, though YMMV for certain. It has been and will continue to be an interesting journey; it has been fun so far and brought back some fond memories of my early Apple ][, Macintosh 128K, and Mac OS X days.

      • TrainedMonkey6 小时前
        Underrated point. Maybe it's the aluminum unibody or the more stable OS, but in my experience the average MBP lifetime is meaningfully higher compared to a Windows machine. My longest-lasting Windows machine was a T400 ThinkPad, which lasted 5 years before the Core 2 Duo architecture stopped being able to keep up. It got replaced with an HP Envy with great specs but made out of plastic, which barely lasted 1.5 years before the screen fell off (literally). Replaced with a 17" 2014 MBP which is still alive after an SSD replacement.
        • ho_schi4 小时前
          ThinkPad X220 here. From late 2012 until late 2023 in service with Linux. It is still usable, but I finally replaced it with an X13 Gen3 AMD running Linux. The magnesium body is a blessing and feels a lot better to the skin than aluminium. The HiDPI display is the biggest upgrade. The six-row keyboard is sturdy but a downgrade from the seven-row keyboard. I miss the notch for opening the lid.

          It got worse with Gen4/5 which now have an awful hump (reverse notch) like a smartphone.

          The long life of the X220 depends on the build quality but also on the five-year replacement-part support: new batteries and a new palm rest (cracked during a journey). It's not just quality you pay for, it is this level of support. And of course more memory. Apple still fails in this regard and barely does anything unless forced by the European Union. Anyway, Apple doesn't officially support Linux, therefore I cannot buy them for work.

          This is the part which saddens me: they do good work, and the next MacBook will also not fully run Linux. This kind of catch-up game by hackers cannot be won - until the vendor decides you're a valuable customer. Therefore, don't buy them expecting that you can run Linux. Maybe you can. But these devices are made for macOS only.

          But if you want to run Linux on a MacBook? Talk to your politicians! And send "messages" with your money to Apple - like buying ThinkPads, Dell's Developer Edition, Purism, System76 and so on :)

          • user_78324 小时前
            > The magnesium body is a bless and feels a lot better to the skin than aluminium.

            Just curious, how does it feel better? My framework apparently has an aluminium lid and a magnesium base, and the mg feels “smoother” than the slightly more textured al… however my iPad is apparently aluminium too and is smooth to the touch.

        • vladvasiliu4 小时前
          There's also the fact that the quality is all-round higher, making them more enjoyable to use. The current HP Elitebooks have much crappier screens than my 2013 MBP. Touchpads have improved, but they're still leagues behind that 11-year-old machine.

          I'm usually fairly careful with my things, so my gen8 hp elitebook still has all its bits together, but I've never really enjoyed using it. The screen, in particular, has ridiculous viewing angles, to the point it's impossible to not have any color cast on some region.

        • throw888885 小时前
          If you replaced the T400 because it felt slow, maybe it’s just a software/OS issue.

          The hardware on Thinkpad T-models should last longer than just 5 years in general.

          My daily-driver laptop at home is a T420 from 2011 with a Core 2 Duo, SSD and 8GB RAM. Works fine still.

          I run Linux + OpenBox, so it is a somewhat lightweight setup to be fair.

          • dm3194 小时前
            My daughter just asked me for the 'tiny' laptop. She has taken my Thinkpad X60 which runs linux mint. It's getting on for 20 years old soon!
          • bhouston4 小时前
            > My daily-driver laptop at home is a T420 from 2011 with a Core 2 Duo, SSD and 8GB RAM. Works fine still.

            I am not sure I would be productive with that. Any Core 2 Duo is 10x slower single core and 20x slower multi-core than a current generation laptop CPU at this point.

            Eg: https://browser.geekbench.com/v6/cpu/compare/8588187?baselin...

            I think it would mostly be good as an SSH terminal, but doing any real work locally on it seems frankly unfeasible.

            • pixelfarmer3 小时前
              The problem is software, though. I have an X200s with 4 GiB RAM from 2009. It was interesting to see how Firefox got slower and slower over the years. Granted, it's not only Firefox but also bloated websites which use loads and loads of JS to display what is ultimately static content. Then again, it's not like JS didn't exist back then: XMLHttpRequest for dynamic website updates was added years prior to that.

              So, yes, a lot of this comes down to software and a massive waste of cycles. I remember one bug in Electron/Atom where a blinking cursor caused something like 10% CPU load. They fixed it, but it tells you a lot about how broken the entire software stack was at that time, and it hasn't gotten better since then.

              I mean, think about this: I used 1280x1024 on a 20" screen back in the mid '90s on (Unix!) machines that are insanely less powerful than even this X200s. The biggest difference: now you can move windows around visually; back then you moved the outer frame to the new place and then the window got redrawn. And the formatting options in browsers are better, i.e. it is easier to design the layout you want. Plus there is no need for palette changes when switching windows anymore ("true color"). The overall productivity hasn't kept up with the increase in computing power, though. Do you think a machine with 100x the performance will give you 100x the productivity? With some exceptions, the weak link in the chain was, is, and will always be humans, and if there are delays, we are almost always talking about badly "optimized" software (aka bloat). That was already an issue back then and, unfortunately, it didn't get better.

            • throw888882 小时前
              Horses for courses ¯\_(ツ)_/¯

              I do development and DevOps on it. Sure there are some intense workloads that I probably couldn’t run, but it works just fine as my daily driver.

              I also have a corporate/work laptop from Dell with 32GB RAM, 16 cores @ 4.x GHz etc. - a beast - but it runs Windows (+ antivirus, group policy crap etc.) and is slower in many aspects.

              Sure I can compile a single file faster and spin up more pods/containers etc. on the Dell laptop, but I am usually not constrained on my T420.

              I generally don’t spend much time waiting for my machine to finish things, compared to the time I spend e.g. writing text/code/whatever.

        • goosedragons1 小时前
          Wouldn't an equivalent Core 2 Duo Mac be just as bad if not worse in the same time frame due to Apple's constant OS updates?

          I have a quad-core W520 from 2011; it's still VERY serviceable for modern usage. Not the fastest, but it even has USB 3 and a 1080p screen, which an equivalent-ish MBP from the time would not have had.

        • Sakos4 小时前
          I have a T420 with a Core i5 2500 in great condition which still runs great on Windows 11. Core 2 Duo just didn't have the performance to have longevity. Sandy Bridge and later really do. Windows devices last for ages and it's really weird pretending this is an Apple only thing.
      • mjangle19854 小时前
        I still have an M1 Air. I love it.

        I considered upgrading, but it's hard to care because my M1 is just so good for what I need it for.

      • Neonlicht2 小时前
        Using a laptop with 8GB RAM for a decade is an exercise in frustration.
        • bzzzt2 小时前
          Only if you insist on running software that needs more RAM, in which case you shouldn't have bought it.
          • Der_Einzige33 分钟前
            Apple should not have sold devices with 8GB or less back in 2018. Them doing it in 2024 is a sign that they think their users are idiots.
      • resource_waste3 小时前
        >how many people kept their Macbooks for nearly a decade

        Are your laptops not lasting 10 years? (battery swaps are a must though)

        The only reason I switched laptops was that I wanted to do AI Art and local LLMs.

        I have so many old laptops and desktops that each of my 5 kids have their own. They are even playing half-modern games on them.

      • hurutparittya3 小时前
        I'm always surprised when people speak highly of Apple devices here. While they do have certain advantages, there are some issues that should be dealbreakers for tech literate people. (in my own, possibly biased opinion at least)

        In case of Macbooks, it's the fact that they refuse to provide an official GPU driver for Linux and general poor support for things outside the walled garden. The Asahi stuff is cool and all, but come on, is a 3.4 trillion dollar company really going to just stand there and watch some volunteers struggling to provide support for their undocumented hardware without doing anything substantial to help? That sounds straight up insulting to me, especially for such a premium product.

        For iphones, it's the fact that you are not allowed to run your own code on YOUR OWN DEVICE without paying the Apple troll toll and passing the honestly ridiculous Apple Store requirements.

        And of course, in both cases, they actively sabotage third party repairs of their devices.

    • Dah00n1 小时前
      >Too many projects get dropped the moment the next shiny ray-traced toy comes along.

      Well.... (from the article):

      >"frankly, I think ray tracing is a bit of a gimmick feature"

      I couldn't agree more, on both counts.

    • patates7 小时前
      > what a reward when it works

      as someone who's been coding for more than 20 years, the happiest and most depressing moments of my career both came during a hardware project I participated in for only 4 months.

  • kristianp9 小时前
    I was going to say she should work for Valve to help get Steam working on Linux on Macs, but it seems she already does? [1]

    [1] https://en.wikipedia.org/wiki/Alyssa_Rosenzweig#Career

    • jillesvangurp6 小时前
      Nice; that raises some interesting questions as to what Valve is planning here. Steam/Proton on Macs would make a lot of sense for them, hard as it may be. People booting Linux on their Macs to play games would probably really annoy Apple.
  • wwalexander2 小时前
    Alyssa Rosenzweig deserves a Turing Award!
  • skoczko6 小时前
    Since bringing modern OpenGL and Vulkan onto Apple Silicon is impossible without an emulation layer anyway, could, theoretically, a native Metal API for Linux be created? Or is Metal too ingrained in macOS SDKs? MoltenVK is attempting to solve the same issues Alyssa was talking about in her talk [1, the last comment on the issue is hers]

    [1] https://github.com/KhronosGroup/MoltenVK/issues/1524

    • Twisell5 小时前
      Nothing is barring Apple from supporting Vulkan natively on macOS. This is essentially the closing statement of Alyssa Rosenzweig's talk.

      With Apple's knowledge of its internal documentation, they are best positioned to produce an even better low-level implementation.

      At this point the main roadblock is the opinionated stance that porting to Metal is the only officially supported way to go.

      If Valve pulls off a witch-crafted way to run AAA games on Macs without Apple's support, that would be an interesting landscape. And it might force Apple to reconsider their approach if they don't want to be cornered on their own platform...

    • aseipp1 小时前
      I don't see why not. There are, after all, implementations of DirectX for Linux too, which is how Proton works. But I'm not sure if it would be better to build that API as a layer on top of Vulkan (completely "client side", like MoltenVK or dxvk do) or actually integrate it more deeply into Mesa. The first is certainly easier to start with, I guess.
  • gigatexal18 小时前
    Is anyone else astonished at how much is missing in the hardware and how much is emulated?
    • dtquad17 小时前
      The graphics pipeline in modern GPUs is mostly a thin low-level Vulkan/Metal-like layer on top of a massively parallel CUDA-like compute architecture.

      It's basically all emulated. One of the reasons GPU manufacturers are unwilling to open source their drivers is because a lot of their secret sauce actually happens in software in the drivers on top of the massively parallel CUDA-like compute architecture.

      • exDM693 小时前
        This statement isn't true at all. There are tons of fixed function hardware units for graphics in GPUs in addition to compute cores: triangle rasterizers, texture units, raytracing units, blitters, copy engines, video codecs, etc. They interact with the shader/compute cores and it's getting more common that the shader core is driving the rasterizer etc than vice versa (mesh shaders and ray tracing for example).

        Calling it "all emulated" is very very far from the truth.

        You can independently verify this by digging into open source graphics drivers.
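
        A minimal sketch of the API-level view (assuming a working Vulkan loader and headers; link with -lvulkan). Note this only shows what a driver chooses to advertise - a set feature bit says nothing about whether the feature is fixed-function hardware or emulated on the compute cores:

          #include <stdio.h>
          #include <vulkan/vulkan.h>

          int main(void) {
              VkInstanceCreateInfo ici = { .sType = VK_STRUCTURE_TYPE_INSTANCE_CREATE_INFO };
              VkInstance instance;
              if (vkCreateInstance(&ici, NULL, &instance) != VK_SUCCESS)
                  return 1;

              uint32_t count = 0;
              vkEnumeratePhysicalDevices(instance, &count, NULL);
              VkPhysicalDevice devs[8];
              if (count > 8) count = 8;
              vkEnumeratePhysicalDevices(instance, &count, devs);

              for (uint32_t i = 0; i < count; i++) {
                  VkPhysicalDeviceProperties props;
                  VkPhysicalDeviceFeatures feats;
                  vkGetPhysicalDeviceProperties(devs[i], &props);
                  vkGetPhysicalDeviceFeatures(devs[i], &feats);
                  /* Advertised support, not proof of dedicated hardware */
                  printf("%s: geometryShader=%u tessellationShader=%u\n",
                         props.deviceName, feats.geometryShader, feats.tessellationShader);
              }
              vkDestroyInstance(instance, NULL);
              return 0;
          }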

      • david-gpu7 小时前
        As a former insider, this is NOT remotely how I would describe reality.

        I have signed NDAs and don't feel comfortable going into any detail, other than saying that there is a TON going on inside GPUs that is not "basically all emulated".

        • user_78324 小时前
          Maybe a tall ask, but have you considered writing down all your experiences into a book, and release it after all NDAs expire? I’d love to read more about low level hardware and the behind the scenes stuff involved. I’m sure you’ve got a lot of good stories.
          • snewman2 小时前
            NDAs don't generally have an expiration date. (As opposed to non-competes and non-solicitation agreements, which generally do.) An NDA typically ends only if the information in question becomes public, and then only for that particular information.
          • almostgotcaught3 小时前
            > I’d love to read more about low level hardware and the behind the scenes stuff involved. I’m sure you’ve got a lot of good stories.

            I don't know why you think there's anything resembling "good stories" (I don't even know what would constitute a good story - swashbuckling adventures?). It's just grimy-ass runtime/driver/firmware code interfacing with hardware features/flaws.

      • samus1 小时前
        A lot of it is software, but not necessarily in the driver. The Nouveau folks pretty much gave up and are using NVIDIA's firmware blob going forward. While that's mostly due to NVIDIA not cooperating in making crucial capabilities of the GPU available to non-signed firmware blobs, the upside is that it will hopefully significantly reduce the effort involved in wiring up new hardware components on the GPU.
      • almostgotcaught15 小时前
        I didn't read the article and don't know about Apple but that's definitely not true for everyone. Source: see amdgpu built on top of HSA.

        EDIT: to be precise yes ofc every chip is a massively parallel array of compute units but CUDA has absolutely nothing to do with it and no not every company buries the functionality in the driver.

        • hilti10 小时前
          [flagged]
    • jsheard18 小时前
      The things being emulated are mostly legacy features that are barely used in modern software, if at all, so the overhead of emulating them for backward compatibility isn't the end of the world. I can't blame Apple for not supporting geometry shaders in hardware, when they're widely considered to be a mistake that never should have been standardized in the first place, and Metal never supported them at all so they could only ever come up in old OpenGL code on macOS.

      https://x.com/pointinpolygon/status/1270695113967181827

      • parl_match18 小时前
        I wouldn't go so far as to say "mistake that should never have been standardized". Their intended use was always pretty limited, though. There's zero reason for anything built in recent memory to use them.
        • jsheard18 小时前
          They had limited uses and turned out to be incredibly hard to implement efficiently in hardware, so in practice it was nearly always faster to just keep using the proven techniques that GS was supposed to replace.

          http://www.joshbarczak.com/blog/?p=667

          • RantyDave16 小时前
            So why does instancing suck? I would have thought it would be heavily optimised in the driver...
            • p_l6 小时前
              It seems Device Generated Commands might be a better option for instancing these days?
          • comex15 小时前
            And yet one of the fancy new features being advertised in recent years (in multiple APIs including Metal) is support for mesh shaders – which seem to have a lot in common with geometry shaders, including the output ordering property that that post blames for geometry shaders’ bad performance. I’m not a graphics programmer myself, but this makes me suspect there’s more to the story.
            • winterismute3 小时前
              I am not an expert on geometry processing pipelines, however Mesh Shaders are specced differently from GS, essentially one of the big problems with GS is that it's basically impossible for the HW, even after all the render state is set and a shader is bound and compiled (and "searched"), to understand how much memory and compute the execution will take, which breaks a lot of the assumptions that allow SIMD machines to work well. In fact, the main advertised feature of GS was to create geometry out of nothing (unbounded particle effects), while the main advertised feature of Mesh Shaders is GPU-driven and efficient culling of geometry (see for example the recent mesh shader pipeline talk from Remedy on Alan Wake 2). It is true that Mesh Shaders are designed also for amplification, and that word has been chosen specifically to hint that you will be able to "multiply" your primitives but not generating random sequences out of thin air.

              It is also true, however, that advances in APIs and HW designs mean that some parts which were troublesome at the time of GS are not so troublesome anymore.
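
              A toy illustration of the allocation problem (the numbers are made up for the example): a geometry shader only declares an upper bound on its output, so the hardware or driver has to provision worst-case space for every invocation in flight, even when most invocations emit almost nothing:

                #include <stdio.h>

                int main(void) {
                    /* Illustrative numbers only */
                    long invocations = 64 * 32;  /* GS invocations in flight   */
                    long max_verts   = 256;      /* declared max_vertices      */
                    long vert_bytes  = 8 * 4;    /* 8 floats per output vertex */
                    long typical     = 3;        /* what most invocations emit */

                    printf("worst case: %ld bytes\n",
                           invocations * max_verts * vert_bytes);  /* 16 MiB   */
                    printf("typical:    %ld bytes\n",
                           invocations * typical * vert_bytes);    /* ~192 KiB */
                    return 0;
                }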

            • dgfitz12 小时前
              If you’re not a graphics programmer, how did you learn of this? I’d love to read about it.
            • HeuristicsCG11 小时前
              if your hardware supports mesh shaders properly, it won't be very hard for it to also support these other features emulated in software (geometry shaders, tessellation, lines etc).

              But mesh shaders are fairly new, will take a few years for the hardware and software to adapt.

              • adrian_b2 小时前
                They are fairly new at the other GPU makers, but the NVIDIA GPUs have them starting with Turing, 6 years ago.

                AMD GPUs have them starting with RDNA 2.

      • raverbashing13 小时前
        Tessellation does not seem to fit this description though
    • tedunangst17 小时前
      Is this really so different from any other mobile derived GPU?
      • ferbivore17 小时前
        Yes. Apple have their own graphics API. They were able to decide that, say, geometry shaders aren't worth the chip area or engineering effort to support. Other IHVs don't get that choice; for geometry shaders, for instance, they're part of both Vulkan and OpenGLES, there are games and benchmarks that use them, and customers (end-users, gamedevs, review/benchmark sites, SoC vendors) will evaluate GPUs based, in some small part, on how good their geometry shader support is. Same story for tessellation, transform feedback, and whatever else Apple dropped.
        • dagmx16 小时前
          Other hardware vendors absolutely do the same. Geometry shaders are poor performers on several other vendors for precisely the same reason
          • ferbivore16 小时前
            At least some Qualcomm, Imagination and Broadcom GPUs support geometry shaders in hardware. Not entirely sure about Arm. To be fair, it could be the support isn't very good.
            • dagmx14 小时前
              Some do, but a lot will have their firmware actually do the translation into other compute types that align better with the hardware.
        • rstat113 小时前
          >>(end-users, gamedevs, review/benchmark sites, SoC vendors) will evaluate GPUs based, in some small part, on how good their geometry shader support is

          Do they? I can't remember ever seeing any mention of geometry shader performance in a GPU review I've read/watched. The only thing I've ever heard about them was how bad they were.

          • lewispollard7 小时前
            Yeah, geometry shaders aren't even widely used, they weren't great in the first place and devs have moved on.
        • fingerlocks17 小时前
          I don’t think it’s accurate to say Apple “dropped” these features. Tessellation is done with general purpose compute shaders. You can do the same with “geometry shaders” if the need arises as well
      • refulgentis17 小时前
        Idk, TIL, had a career in mobile for 15 years running and I didn't know this was a distinctive quality of mobile GPUs. (makes sense! but all that to say, I'm very interested to hear more, and I'll trade you an answer to that question: "maybe not! sounds like you got some smart stuff to share :)")
  • gcr15 小时前
    I’ve been trained to expect articles with this headline to say something like “we’re dropping support and are getting acqui-hired.”
  • UncleOxidant19 小时前
    Will M3/M4 need completely different drivers?
    • hi_hi17 小时前
      I think the answer is yes. I'm making assumptions based on this part of Alyssa's talk from a couple of weeks ago, where she talks about M3 having specific driver support for raytracing which doesn't exist in previous versions.

      https://youtu.be/pDsksRBLXPk?t=2895

      The whole thing is worth watching to be honest, it's a privilege to watch someone share their deep knowledge and talent in such an engaging and approachable way.

      • olliej15 小时前
        ooh, I missed this, thanks for the link!
    • wmf19 小时前
      Probably. Apple made major changes to the GPU in M3.
      • coldtea18 小时前
        Major base changes, or just added more stuff on top of the same base?
        • ferbivore16 小时前
          M3 has mesh shader support. The geometry pipeline they inherited from PowerVR fundamentally doesn't support them, for reasons that go way over my head. They probably changed a good chunk of it.
          • olliej15 小时前
            > for reasons that go way over my head

            In fairness to you I think a lot of the stuff involving hardware goes over everyone's heads :D

            I've seen comments in a number of articles (and I think a few comments in this thread) saying that there are a few features in Vulkan/OpenGL/Direct3D that were standardized ("standardized" in the D3D case?) or required, but that turned out to be really expensive to implement, hard to implement fast in hardware anyway, and not necessarily actually useful in practice. I think geometry shaders may have been one of those cases but I can't recall for sure.

            • ferbivore15 小时前
              Mesh shaders are actually useful. Or at least game engine people love them, which was not the case for geometry shaders or even tessellation really. They are extremely painful to add support for though. Aside from Apple I don't think any mobile IHVs have a working implementation.
        • wtallis18 小时前
          https://forum.beyond3d.com/threads/apple-dynamic-caching-on-... Changing the register file into a cache sounds like a major base change. Raytracing is a major feature added on top of existing functionality. So I'd say the answer is: plenty of both.
        • sroussey16 小时前
          How it handles memory and registers is quite different.
          • TylerE13 小时前
            How much will this matter to (somewhat graphics demanding) end users? I'm somewhat eagerly awaiting swapping out my M1 Studio with the M4 Studio that is all but confirmed to be coming at some point next year... More GPU grunt would certainly make me happy. Even the M1 is a far more competent gaming machine than I expected but I came from an i9/3080 machine so, well, more is more, as long as they can keep it near silent and relatively cool running.
      • a_wild_dandan18 小时前
        What were the major differences?
    • hellavapid10 小时前
      'twould be very apple
  • kachapopopow18 小时前
    I always wondered about these /SubscriberLink/ links. Is sharing them considered unethical?
    • anamexis18 小时前
      From the LWN FAQ:

      > Where is it appropriate to post a subscriber link?

      > Almost anywhere. Private mail, messages to project mailing lists, and blog entries are all appropriate. As long as people do not use subscriber links as a way to defeat our attempts to gain subscribers, we are happy to see them shared.

    • zarzavat15 小时前
      I'm sure that they are very, very happy that HN is directing a firehose of traffic at their site on the regular.

      In order to subscribe people need to know that LWN exists.

      • jlarocco13 小时前
        I know this is off on a tangent, but I recently signed up for LWN and it's well worth the price. The articles and news items alone are worth it, but the archive is just amazing.
        • pimeys8 小时前
          Same here. I originally found LWN through a HN post. I've been a subscriber for a few years now, and I'm reading almost everything they publish, even if I'm not always understanding everything they talk about. The quality of writing is very high.
    • sophacles18 小时前
      Representatives of LWN have posted here before saying they are OK with it, along with a polite request not to have it go overboard since they need to fund the site and writers, editors, etc. That funding comes from subscriptions only IIUC.

      FWIW an LWN subscription is pretty affordable and supports some of the best in-depth technical reporting about Linux and linux-related topics available.

      (I am not affiliated with LWN, just a happy subscriber - I also credit some of my career success to the knowledge I've gained by reading their articles).

      • ewoodrich17 小时前
        A couple times now I’ve remembered to refill my subscription after my old one ran out of months thanks to an LWN free article being posted to HN.

        So n=1 it’s an effective advertising tactic even though I can read the specific article for free.

      • awsrhawrhawrh15 小时前
        Funding also comes from sponsors, as the article obviously states:

        "I would like to thank LWN's travel sponsor, the Linux Foundation, for travel assistance to Montreal for XDC."

    • cbhl18 小时前
      The purpose of these links is to be shared, see the FAQ: https://lwn.net/op/FAQ.lwn#slinks

      The LWN paywall is unique in that all the content becomes freely available after a week. The subscriber links are there to encourage you to subscribe if you are in a position to do so.

    • vintagedave18 小时前
      It says,

      > The following subscription-only content has been made available to you by an LWN subscriber.

      I might be wrong but I read that as there being funding to make the previously paywalled content available, probably on an article-specific basis. Does anyone know?

      • anamexis18 小时前
        Also from the LWN FAQ:

        > What are subscriber links

        > A subscriber link is a mechanism by which LWN subscribers may grant free access to specific LWN articles to others. It takes the form of a special link which bypasses the subscription gate for that article.

    • flykespice17 小时前
      It's no different from subscriber-only newsletter articles shared here being accompanied by an archive link in the topmost comment.

      Users here seem to not care about those "ethics"

      • wtallis15 小时前
        There's nothing ethically wrong about sharing LWN subscriber links on HN when the LWN editor has repeatedly posted on HN that doing so is fine.
      • samus1 小时前
        It's really not such a big deal since everything on LWN becomes public after some weeks. Rather the opposite, it ensures that the site remains well-known.
      • rstat113 小时前
        There's a difference between posting a sub-only link that's intended to be shared in moderation, and posting a link to a paywalled article as if expecting the clicker of that link to pay for a sub to that website just so they can read that article.

        It's a pretty absurd expectation.

  • scottlamb18 小时前
    > tessellator.cl is the most unhinged file of my career

    ...so far. The presenter is only 23 apparently. Maybe I'm speaking only for myself here, but I think career unhingedness does not go down over time as much as one might hope.

    In all seriousness, she does really impressive work, so when she says this 2,000 lines of C++ is inscrutable, that gives one pause. Glad it's working nonetheless.

    • nwallin17 小时前
      It's unfathomable that she's 23 and wrote an open source graphics driver for a closed-source graphics card. At least Einstein had the decency to wait until he was 26 to invent special relativity.
      • TheCycoONE13 小时前
        Multiple; before this she wrote Panfrost for Arm Mali GPUs, starting in 2018.
    • Dah00n1 小时前
      >The presenter is only 23 apparently

      Yes, well, from the article:

      >That works, but ""tessellator.cl is the most unhinged file of my career""; doing things that way was also the most unhinged thing she has done in her career ""and I'm standing up here in a witch hat for the fifth year in a row"". The character debuted in the exact same room in 2019 when she was 17 years old, she recalled.

      17. That's impressive.

    • jjmarr12 小时前
      I didn't even know this until now. I looked at her resume and I can't grasp how she did an undergraduate degree from 2019-2023, meaning that writing an entire GPU driver was the side project she did during her degree.
      • I_AM_A_SMURF10 小时前
        For gifted people undergraduate degrees are easy (unless you want to go through it really fast). Some people I know barely studied for their exams and spent all their time in research projects / their own thing.
      • saagarjha9 小时前
        There’s a lot of free time when you’re in college.
    • dividuum18 小时前
      • worstspotgain15 小时前
        Is there a blog post about what specifically she found to be inscrutable? The C++ doesn't look all that terse at a syntactic level, and has plenty of comments. Are the problems at the domain level?
        • samus1 小时前
          I can imagine she was glad it took only some gentle massaging to make it work. If she had encountered roadblocks there, she would have had to dig in for real.

          The code is quite low on comments and doesn't really explain the math there. It probably makes sense if you have a background in the lofty math side of computer graphics, but that's a slightly different skillset than being able to reverse-engineer and bring up exotic hardware.

        • raggi8 小时前
      • raverbashing9 小时前
        Honestly it doesn't look too bad

        A lot of the size is just because the code deals with a lot of 3/4 dim stuff and also some things are a bit more verbose in code but translate to something short in assembly

      • whatever118 小时前
        She is not doing CI/CD so the PMs would fire her in any modern company. "Where are the unit tests?!? How is it possible to write code without tests and 100% coverage!"
        • wmf17 小时前
          OpenGL and Vulkan have tons of tests and Alyssa runs them. I don't know if it's automated through CI.
          • manmal16 小时前
            Reportedly, the lack of M3 support so far is because there is no M3 Mac Mini which they can use for CI.
            • fl0id15 小时前
              That was one reason to delay it, but another, more important reason they gave was to implement GPU things like this and other missing features.
            • spockz11 小时前
              I’m sure if that is the issue we can together find a way to sponsor a MacBook with an M3? Or does it specifically have to be a Mac mini?
  • egwor17 小时前
    Really impressive. Well done (and thanks for the laughs. Starting in French would be so funny)
  • renewiltord19 小时前
    The work by Alyssa R and Asahi Lina is great stuff. I have to say that a lot of this is really inscrutable unless you’re used to driver code. I wish it were much easier to write this stuff but hardware stuff is so idiosyncratic.

    Have to say I do enjoy all the old school style whimsy with the witch costume and whatnot.

    • hi_hi17 小时前
      I've just been watching her recent talk. I noticed she appears to change slides with a wave of her wand. Is there a known piece of hardware one can purchase to do this?

      I tried googling, but trying to find the specific result I'm interested in amongst all the blog spam garbage related to PowerPoint is beyond me. Even Google's own AI couldn't help. Sad times!

      • bryanlarsen17 小时前
        Purchasing one has been illegal since 1863, but the technology typically used is a human assistant, although the signal is usually more subtle.
      • jeroenhd7 小时前
        In this case there's probably a human doing the slides, but smart watches have apps that can control slideshows in a variety of ways. There's at least one WearOS app (WowMouse) that does some gesture based presentation control.
      • sen17 小时前
        She might be doing some OpenCV/MediaPipe gesture tracking? There are lots of tutorials out there and it's not super difficult to get some basic gesture tracking going on your laptop.
      • m4rtink7 小时前
        Maybe some ESP32 or other MCU and an accelerometer? That should comfortably fit into a magic wand.
      • sroussey15 小时前
        Ug… a person is watching and clicking for her when she does that?
      • Onavo14 小时前
        Is Leap Motion still a thing?
    • dylan60419 小时前
      what they do is just this side of magic, so maybe it's not really a costume?
      • amelius18 小时前
        Any sufficiently closed technology is indistinguishable from magic.

        In fact, the company is so closed that Rosenzweig is the one we should consult when we encounter aliens.

        • dylan60418 小时前
          I don't think the work they do is what one could call closed though is it?
    • gjsman-100018 小时前
      [flagged]
      • Teever18 小时前
        Is that actually the purpose though? The appearance of diversity?

        Maybe marcan42 just wants to present themselves that way.

        • TimTheTinker17 小时前
          [flagged]
        • userbinator16 小时前
          [flagged]
        • gjsman-100018 小时前
          [flagged]
          • colechristensen18 小时前
            Are you being offended for yourself or on behalf of other people? If the latter, please leave that for others to judge for themselves.
            • talldayo18 小时前
              There's something rich about this gjsman showing up to defend Apple as an unpaid lackey, and inventing controversy to justify hating the opposition. I'd say something like "I wonder what it's like inside this user's head" but it seems like their deepest insecurities have been made quite external by now.
              • colechristensen17 小时前
                Here it is best to not engage or add any volume to discussions like this. How you get a community that isn't infested with shallow controversy is by not giving it any attention but small amounts of gentle discouragement.
            • gjsman-100018 小时前
              [flagged]
      • renewiltord18 小时前
        Yeah, the culture war stuff is a big deal for people who don't write code. But I don't think their opinions are that important for the Linux kernel. Pretty effective to have it as a honeypot. Allows for easy programmatic filtering-out of non-coders.
      • slekker18 小时前
        Is that a fact or are you speculating?
        • tredre317 小时前
          Marcan42 accidentally posted on his twitter as Asahi Lina, and on screen Lina has accidentally shown things identifiable to Marcan (usernames, logged in mastodon account).

          But he clearly doesn't want to be linked/equated to her for whatever reason, so I don't know why GP brought this up beyond stirring up drama.

        • stepupmakeup18 小时前
          There was drama with Lina and another person in the vtuber hacker space who directly referred to them as marcan.
        • gjsman-100018 小时前
          [flagged]
      • MrBuddyCasino18 小时前
        [flagged]
  • olliej16 小时前
    Alyssa is amazing, I remember the first article about the GPU work, and then learning she was only 17 and poof mind blown.

    It's truly stunning that anyone could do what she did, let alone a teenager (yes I know, she's not a teenager anymore, passage of time, etc :D)

    • Daz19 小时前
      [flagged]
  • recvonline16 小时前
    Any link to the fact that the drivers are written in Rust?
  • computersuck16 小时前
    That's not even a costume because she's definitely a wizard
    • indrora13 小时前
      (:
    • mlindner7 小时前
      [flagged]
      • cariaso7 小时前
        Alyssa Rosenzweig
        • mlindner7 小时前
          [flagged]
          • m2fkxy6 小时前
            your point being?
            • lemper6 小时前
              enough bro. don't feed the troll.
  • m46310 小时前
    "our target hardware is running literally none of those things". What is needed is to somehow translate DirectX to Vulkan, Windows to Linux, x86 to Arm64, and 4KB pages to 16KB pages.

    oh my.
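
    The page-size mismatch in particular is easy to trip over: anything that hard-codes 4096 for alignment silently breaks on a 16K-page kernel. A small sketch of the portable way to do it:

      #include <stdio.h>
      #include <unistd.h>

      int main(void) {
          /* 16384 on Asahi's 16K kernels, 4096 on a typical x86 distro */
          long page   = sysconf(_SC_PAGESIZE);
          long offset = 123456;

          /* Align down to a page boundary using the runtime value,
             not a hard-coded 4096. */
          long aligned = offset & ~(page - 1);

          printf("page size %ld: %ld aligns down to %ld\n", page, offset, aligned);
          return 0;
      }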

  • trallnag7 小时前
    [flagged]
  • adastra2216 小时前
    > frankly, I think ray tracing is a bit of a gimmick feature

    That's incredibly arrogant. The whole industry is adopting ray tracing, and it is a highly desired feature that people are upgrading video cards to get working in the games they play.

    • nfriedly16 小时前
      I don't know, Hardware Unboxed recently did a review of 30-some games comparing the visuals with and without ray tracing, and the conclusion for the majority of them was that it wasn't worth it. There were only maybe half a dozen games where the ray tracing was an unambiguous improvement.

      So I think calling it "a bit of a gimmick" is accurate for many of the games it has shipped in, even if not all of them.

      • wtallis15 小时前
        It's also a pretty large performance hit, even when the quality improvement is subtle. A major and unfortunate consequence is that it has driven even more games to rely on upscaling and heavy post-processing, all of which tend to blur the output image and introduce artifacts and add latency.
      • jms5515 小时前
        You're looking at years of careful artist and engineer work to get something that looks almost as good as pathtraced visuals. The fact that it's so good without any raytracing is a credit to the developers.

        Replacing all that effort with raytracing and having one unified lighting system would be a _major_ time saver, and allows much more dynamic lighting than was previously possible. So yeah some current games don't look much better with RT, but the gameplay and art direction was designed without raytracing in mind in the first place, and had a _lot_ of work put into it to get those results.

        Sure fully pathtraced graphics might not be 100% usable currently, but the fact that they're even 70% usable is amazing! And with another 3-5 years of algorithm development and hardware speedups, and developers and artists getting familiar with raytracing, we might start seeing games require raytracing.

        Games typically take 4+ years to develop, so anything you're seeing coming out now was probably started when the best GPU you could buy for raytracing was an RTX 2080 TI.

        • wtallis14 小时前
          > Sure fully pathtraced graphics might not be 100% usable currently, but the fact that they're even 70% usable is amazing!

          Aside from Cyberpunk 2077 and a handful of ancient games with NVIDIA-sponsored remakes, what even offers fully path traced lighting as an option? The way it went for CP2077 makes your "70% usable" claim seem like quite an exaggeration: performance is only good if you have a current-generation GPU that cost at least $1k, the path tracing option didn't get added until years after the game originally shipped, and they had to fix a bunch of glitches resulting from the game world being built without path tracing in mind. We're clearly still years away from path tracing being broadly available among AAA games, let alone playable on any large portion of gaming PCs.

          For the foreseeable future, games will still need to look good without fully path traced lighting.

          • jms5511 小时前
            > performance is only good if you have a current-generation GPU that cost at least $1k

            That's why I said games aren't currently designed with only pathtracing in mind, but in 3-5 years with faster hardware and better algorithms, we'll probably start to see it be more widespread. That's typically how graphics usually develop; something that's only for high end GPUs eventually becomes accessible to everyone. SSAO used to be considered extremely demanding, and now it's accessible to even the weakest phone GPU with good enough quality.

            Again the fact that it's feasible at all, even if it requires a $1000 GPU, is amazing! 5 years ago real time path tracing would've been seen as impossible.

            > The way it went for CP2077 makes your "70% usable" claim seem like quite an exaggeration

            Based on the raw frame timing numbers and temporal stability, I don't think it is. RT GI is currently usually around ~4ms, which is at the upper edge of usable. However the temporal stability is usually the bigger issue - at current ray counts, with current algorithms, either noise or slow response times is an inevitable tradeoff. Hence, 70% usable. But with another few years of improvements, we'll probably get to the point where we can get it down to ~2.5ms with the current stability, or 4ms and much more stable. Which would be perfectly usable.
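
            For context on what ~4 ms means in practice (the 4 ms figure is the estimate above; the rest is simple frame-budget arithmetic):

              #include <stdio.h>

              int main(void) {
                  double rt_gi_ms   = 4.0;            /* per-frame RT GI cost quoted above */
                  double budget_60  = 1000.0 / 60.0;  /* ~16.7 ms per frame at 60 fps */
                  double budget_120 = 1000.0 / 120.0; /* ~8.3 ms per frame at 120 fps */

                  printf("share of a 60 fps frame:  %.0f%%\n", 100.0 * rt_gi_ms / budget_60);  /* ~24% */
                  printf("share of a 120 fps frame: %.0f%%\n", 100.0 * rt_gi_ms / budget_120); /* ~48% */
                  return 0;
              }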

            • wtallis10 小时前
              > RT GI is currently usually around ~4ms, which is at the upper edge of usable. However the temporal stability is usually the bigger issue - at current ray counts, with current algorithms, either noise or slow response times is an inevitable tradeoff. Hence, 70% usable. But with another few years of improvements, we'll probably get to the point where we can get it down to ~2.5ms with the current stability, or 4ms and much more stable. Which would be perfectly usable.

              Maybe you should be saying 70% feasible rather than 70% usable. And you seem to be very optimistic about what kind of improvements we can expect for affordable, low-power GPU hardware over a mere 3-5 years. I don't think algorithmic improvements to denoisers and upscalers can get us much further unless we're using very wrong image quality metrics to give their blurriness a passing grade. Two rays per pixel is simply never going to suffice.

              Right now, an RTX 4090 ($1700+) runs at less than 50 fps at 2560x1440, unless you lie to yourself about resolution using an upscaler. So the best consumer GPU at the moment is about 70-80% of what's necessary to use path tracing and hit resolution and refresh rate targets typical of high-end gaming in 2012.

              Having better-than-4090 performance trickle down to a more mainstream price point of $300-400 is going to take at least two more generations of GPU hardware improvements even with the most optimistic expectations for Moore's Law, and that's the minimum necessary to do path tracing well at a modest resolution on a game that will be approaching a decade old by then. It'll take another hardware generation for that level of performance to fit in the price and power budgets of consoles and laptops.
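
              For scale, a rough back-of-envelope (my own numbers, not from any benchmark): 2560x1440 is about 3.7 million pixels, so even a budget of 2 rays per pixel at 60 fps works out to roughly 3.7M x 2 x 60 ≈ 440 million rays per second before counting any secondary bounces, and converging without heavy denoising takes far more samples per pixel than that.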

          • porphyra11 小时前
            UE5 supports fully path traced lighting, and Black Myth Wukong is a recent example.
          • paavohtl9 小时前
            Alan Wake 2 has had path tracing from day 1, and it looks excellent with (and without) it.
          • talldayo11 小时前
            > For the foreseeable future, games will still need to look good without fully path traced lighting.

            And in 7-10 years when the software stack is matured, we'll be thanking ourselves for doing this in hardware the right way. I don't understand why planning for the future is considered so wasteful - this is an architecture Apple can re-use for future hardware and scale to larger GPUs. Maybe it doesn't make sense for Macs today, but in 5 years that may no longer be the case. Now people don't have to throw away a perfectly good computer made in these twilight years of Moore's law.

            For non-games applications like Blender or Cinema4D, having hardware-accelerated ray tracing and denoising is already a game-changer. Instead of switching between preview and render layers, you can interact with a production-quality render in real time. Materials are properly emissive and transmissive, PBR and normal maps composite naturally instead of needing different settings, and you can count the time it takes before getting an acceptable frame in milliseconds, not minutes.

            I don't often give Apple the benefit of the doubt, but hardware-accelerated ray tracing is a no-brainer here. If they aren't going to abandon Metal, and they intend to maintain their minuscule foothold in PC gaming, they have to lay the groundwork for future titles to be developed on. They have the hardware investment, they have the capital to invest in their software, and their competitors, Khronos (apparently) and Microsoft, both had ray tracing APIs for years before Apple finally released theirs.

            • rowanG07711 小时前
              I think you have the wrong impression. Apple M2 does not have hw ray tracing. M3 and M4 do.
      • porphyra11 小时前
        Hmm, I watched the video [1], and in the games where it made an unambiguous difference, the improvement in reflection quality is stark and noticeable. Anecdotally, when I was playing Black Myth Wukong recently, there were some shiny environments where full path tracing really made a huge difference.

        So I guess it's just a "gimmick" in that relatively few games properly take advantage of this currently, rather than the effect not being good enough.

        [1] https://www.youtube.com/watch?v=DBNH0NyN8K8

        • adastra229 小时前
          Using raytracing in a very recent, high-end game that was designed for raytracing has an extremely noticeable positive effect on visual quality. Wukong is a perfect example.

          Unfortunately, most people’s experience with raytracing is turning it on in a game that was not designed for it, where it was added through a patch, which results in worse lighting. Why? Because the rasterized image includes baked-in global illumination using more light sources than whatever was hastily put together for the raytracing patch.

          • IntelMiner8 小时前
            The first game I encountered with ray tracing support was World of Warcraft "Classic"'s Burning Crusade launch a couple of years back.

            WoW Burning Crusade originally launched in *2007*. The "Classic" re-release of the game uses the modern engine but with the original game art assets and content.

            Does it do anything in the 'modern' WoW game? Probably! In Classic though all it did was tank my framerate.

            Since then I also played the unimaginable disaster that was Cyberpunk 2077. For as "pretty" as I suppose the game looked, I can't exactly say the ray tracing improved anything.

    • pxc1 小时前
      > it is a very desired feature people are upgrading video cards to get working on games they play.

      Ray tracing is also extremely heavily marketed, and at a time when GPU upgrades have become infrequent for most people due to extremely high prices and relatively weak overall performance gains year over year.

      The industry is adopting ray tracing because they need to be able to offer something new when the overall performance gains generation over generation have been slowing for years. They've been leveraging marketing hard to try to generate demand for upgrades among gamers, knowing cryptocurrency won't buoy them forever.

      That ray tracing is cited as a reason among gamers who do upgrade their GPUs is unsurprising and not really a strong indicator that ray tracing is a great feature.

      At any rate, both the gaming industry and the wider tech industry are known to be extremely faddish at times. So is consumer behavior. 'Things the whole industry is doing' has always been a category that includes many gimmicks.

    • ZiiS5 小时前
      I'm not sure I can square "incredibly arrogant" with the Herculean work and insight demonstrated here.

      However, it is important to put things in context. Something can be a 'gimmick' on several-year-old integrated mobile hardware that can't run most games at a reasonable FPS even without it, and not a 'gimmick' on cutting-edge space heaters.

    • kllrnohj15 小时前
      The M1-M4 GPUs are also nowhere close to fast enough for it to be useful. Just like Snapdragon having ray tracing support is ridiculous.
    • MBCook16 小时前
      That doesn’t necessarily mean it’s important, just that it’s the only thing they can think of to sell. Much like some AI stuff we’re seeing right now.

      Personally I think it’s useful for a few things, but it’s not the giant game changer I think they want you to think it is.

      Raytraced reflections are a very nice improvement. Using raytracing for global illumination and shadows is also a very good improvement.

      But it’s not exactly what the move to multi texturing was, or the first GPUs. Or shaders.

    • zamadatix13 小时前
      While it may eventually become a non-gimmick in the industry as the compute needed for it advances, it's always going to be quite the gimmick in terms of what the M1/M2 GPU can do with it.
    • musictubes13 小时前
      Can the hardware for ray tracing be used for anything else? Ray tracing is just math; is that math applicable to any non-graphics use?
      • exDM693 小时前
        > Ray tracing is just math, is that math applicable for any non graphics use?

        It's not "just math"; it's data structures and algorithms for tree traversal, with a focus on being friendly to the memory and cache hardware.

        The math part is trivial. It's the memory part that's hard.

        Ray tracing hardware and acceleration structures are highly specific to ray tracing and not really usable for other kinds of spatial queries. That said, ray tracing has applications outside of computer graphics. Medical imaging for example.
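
        To illustrate the "math is trivial" point, here's a sketch of the ray-vs-AABB slab test that each BVH node visit boils down to (my own simplified C, not code from any real driver or API). The arithmetic is a few multiplies, subtractions and min/max operations per node; the hard part is streaming millions of these nodes through the caches in an unpredictable order:

            #include <stdbool.h>
            #include <stdio.h>

            /* Ray: origin o and precomputed inverse direction inv_d (1/d per axis).
               Box: per-axis min/max corners. */
            static bool ray_aabb_hit(const float o[3], const float inv_d[3],
                                     const float bmin[3], const float bmax[3],
                                     float t_max) {
                float t_near = 0.0f, t_far = t_max;
                for (int axis = 0; axis < 3; axis++) {
                    float t0 = (bmin[axis] - o[axis]) * inv_d[axis];
                    float t1 = (bmax[axis] - o[axis]) * inv_d[axis];
                    if (t0 > t1) { float tmp = t0; t0 = t1; t1 = tmp; }
                    if (t0 > t_near) t_near = t0;
                    if (t1 < t_far)  t_far  = t1;
                    if (t_near > t_far) return false;   /* slabs don't overlap: miss */
                }
                return true;
            }

            int main(void) {
                const float origin[3]  = { -5.0f, 0.5f, 0.5f };
                const float inv_dir[3] = { 1.0f, 1e30f, 1e30f };   /* ray along +X */
                const float bmin[3]    = { 0.0f, 0.0f, 0.0f };
                const float bmax[3]    = { 1.0f, 1.0f, 1.0f };
                printf("hit: %d\n", ray_aabb_hit(origin, inv_dir, bmin, bmax, 1000.0f));
                return 0;
            }

        A hardware RT unit essentially repeats that test (plus a ray-triangle test at the leaves) while walking the tree, which is why node layout and memory traffic, not the arithmetic, dominate the cost.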

      • boulos12 小时前
        You'll have to see what the instruction set / features are capable of, but most likely the "hardware ray tracing" support means it can do ray-BVH and ray-triangle intersection in hardware. You can reuse ray-box and ray-triangle intersection for collision detection.

        The other parts of ray tracing, like shading and so on, are usually just done on the general-purpose compute units.

        • kllrnohj3 小时前
          > You can reuse ray-box and ray-triangle intersection for collision detection.

          The problem with doing so, though, and with GPU physics in general, is that the latency is too high to incorporate it effectively into a typical game loop. It'll work fine for things that don't impact the world, like particle simulations or hair/cloth physics, but for anything interactive the latency cost tends to kill it. That, and the GPU is usually the visual bottleneck anyway, so having it spend power on work the half-idle CPU could do adequately isn't a good use of resources.

      • flohofwoe8 小时前
        It should be useful for anything that involves stabbing checks on large triangle soups. Whether it's worth "wasting" die space for this is debatable of course.
    • flohofwoe8 小时前
      ...and yet the most popular use case for raytracing seems to be more correct reflections in puddles ;)
    • madeofpalk16 小时前
      The same was true about 3D TVs. Also a gimmick.
      • talldayo15 小时前
        3D TVs failed before they existed as a product category. You cannot encode enough information through stereoscopy to create a convincing or "realistic" 3D effect. The conditions for making it work have always been fickle and as 6DOF VR/AR is proving, it was never enough to be convincing anyways. Encoding two slightly different versions of a video and layering them on top of each other is a dirty hack. It is not a holistic solution to a problem they seriously intended to solve in the first place. It is a novelty designed from the bottom-up as a scarcely-usable toy.

        Ray tracing is, by comparison, the holy grail of all realtime lighting effects. Global illumination makes the current raster lighting techniques look primitive by comparison. It is not an exaggeration to say that realtime graphics research largely revolves around using hacks to imitate a fraction of ray tracing's power. They are piling on pipeline-after-pipeline for ambient occlusion, realtime reflections and shadowing, bloom and glare as well as god rays and screenspace/volumetric effects. These are all things you don't have to hack together when your scene is already path tracing the environment with the physical properties accounted for. Instead of stacking hacks, you have a coherent pipeline that can be denoised, antialiased, upscaled and post-processed in one pass. No shadowmaps, no baked lighting, all realtime.

        There is a reason why even Apple quit dragging their feet here - modern real-time graphics are the gimmick, ray tracing is the production-quality alternative.

        • adastra2211 小时前
          Thank you. This guy gets it.

          Path tracing is the gold standard for computer graphics. It is a physically based rendering model that is based on how lighting actually works. There are degrees of path tracing of varying quality, but there is nothing else that is better from a visual quality and accuracy standpoint.
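
          For reference (the standard formulation, not something spelled out in the thread): path tracing is a Monte Carlo estimate of the rendering equation, L_o(x, w_o) = L_e(x, w_o) + ∫ f_r(x, w_i, w_o) L_i(x, w_i) (n · w_i) dw_i, i.e. the light leaving a surface point is what the point emits plus all incoming light weighted by the material's BRDF and the cosine of the incidence angle.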

          Your modern AAA title uses a massive number of impressive hacks to get rasterization into the uncanny valley, since rasterization has nothing to do with how photons work. That could all be thrown out and replaced with “model photons interacting with the scene” if the path tracing hardware were powerful enough. It’d be simpler and perfectly accurate. The end result would not live in the uncanny valley; it would be indistinguishable from reality.

          Assuming the hardware was fast enough. But we’ll get there.

          • flohofwoe8 小时前
            Think of the 'impressive hacks' used in realtime 3D rendering as compression techniques to reduce bandwidth (and this goes all the way back to 8-bit home computers with their tricky video memory encoding, which was essentially hardware image compression to reduce memory bandwidth).

            Why waste computing power on 'physically correct' algorithms, when the 'cheap hacks' can do so much more in less time while producing nearly the same result?

            • talldayo19 分钟前
              Because a lot of the "cheap hacks" aren't even cheap. Famously, using real-time SSR to simulate ray-traced reflections can end up slower than ray tracing them from the jump, and the same goes for high-res shadowmaps. Ambient occlusion is a lighting pass you can skip entirely if you render shadows the right way with RT from the start. If you keep stacking these hacks until you have every feature that an RT scene does, you're probably taking longer to render each frame than a globally illuminated scene would.

              Accelerating ray tracing in hardware is a literal no-brainer unless you deliberately want to ostracize game developers and animators on the Mac. I understand the reactionary "But I don't care about dynamic shadows!" opinion, but there's practically no opportunity cost here. If you want to use traditional lighting techniques, you can still render them on an RT-enabled GPU. You just also have the option of not wanting to pull out your fingernails when rendering a preview in Blender.

          • Nullabillity10 小时前
            This is all built on the, frankly, nonsensical assumption that photorealism is the, let alone a, primary goal.
            • adastra229 小时前
              In most cases of high end / AAA game titles, it is. Not all, but certainly most.
            • talldayo30 分钟前
              Not really. Tiny Glade is a recent ray-traced title that has no goal of photorealism whatsoever, and it looks gorgeous with RT lighting and shadows: https://youtu.be/QAUSBxxgIbQ

              You can scale these same principles to even less photorealistic games like the Borderlands series or a Grand Theft Auto game. Ray tracing is less about photorealism (although that is a potent side effect) and more about creating dynamic lighting conditions for an interactive game. By simulating the behavior of light you get realistic lighting, and, photoreal or not, a lot of games rely on SSR and shadowmaps that could be replaced with realtime RT.

    • rowanG07711 小时前
      Wow! If the industry is pushing it, it must be good, right? Just like the megapixel wars for cameras.