A new video format or more demanding music software gets released that slows the machine down, or battery life craters.
Well, I haven’t had even a twinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically, of course. But this is the first machine where, even after several years, I haven’t once caught myself wishing it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps the architecture of these new Mac chips is just so damn good.
Apple's M1 came at a really interesting point. Intel was still dominating the Windows laptop game, but generational improvements felt pretty lame: a whole lot of money for mediocre performance gains, high heat output, and not very impressive battery life. The laptop ecosystem changed rapidly as not only did the Apple M1 arrive, but AMD also started to gain real prominence in the laptop market after hitting it pretty big in the desktop and data center CPU markets. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1) and at times it feels almost 10 years old in comparison, despite being 3-4 years newer.
That said, Intel still has yet to catch up to AMD on efficiency, unfortunately. They've improved generationally, but if you look at power efficiency benchmarks of Intel CPUs vs AMD, you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can confirm that these devices are rarely good showcases for the chipsets inside them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, but I wouldn't compare a MacBook, even a MacBook Air (a laptop), with a Surface Pro (a 2-in-1 device). Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th-gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
And it's not the same: running Windows natively on a Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than it is with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use the best of both.
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability to generally play any random Windows game. I have a Parallels and a VMware Fusion license (well... had, anyway), and I'm a long-time (20 years) Linux user; I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
Battery life is decent.
At this point I’m not switching from laptop Linux. The machines can even game (thanks proton/steam)
https://browser.geekbench.com/macs/macbook-pro-14-inch-2021-...
https://browser.geekbench.com/v6/cpu/4260192
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for the sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty well, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
Those are another two reasons why I can't ignore Apple Silicon.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
YMMV.
I've had my XPS 13 since 2016. Really the only fault I have against it nowadays is that 8GB of RAM is not sufficient to run IntelliJ anymore (hell, sometimes it even bogs down my 16GB MBP).
Now, I've also built an absolute beast of a workstation with a 7800X3D, 64GB RAM, 24GB VRAM, and a fast SSD. Is it faster than both? Yeah. Is my old XPS slow enough to annoy me? Not really. YouTube has been sluggish to load/render lately, but I think that's much more Google making changes to make the Firefox/uBlock experience worse than any fault of the laptop.
FWIW, Qualcomm cancelled orders of its Windows devkit and issued refunds before the lawsuit. That is probably not a good sign.
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously, if opening a browser is the most taxing thing your machine does, the difference will be minimal. But for video or music editing, compiling applications, and other intensive tasks, the upgrade is PHENOMENAL.
I upgraded from a 13 Pro to a 15 Pro expecting zippier performance, and it feels almost identical, if not weirdly a bit slower in rendering and typing.
I wonder what it will take to make Mac/iOS feel faster
I went from an iPhone 13 mini to an iPhone 16 and it's a significant speed boost.
The new camera button is kinda nice though.
I was initially indifferent about the camera button, but now that I'm used to it it's actually very useful.
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I could once disable all animation delays, and it felt like a new machine. (I must add that the animations are very important and generally great UX design, but most are just a tad too slow.)
The CPU? Ah, never really felt a difference.
Infuriated by the 13.
The 3.5mm audio Lightning adapters disconnect more often than usual. All I need to do is tap the adapter and it disconnects.
And that Apple has now stopped selling them is even more infuriating; it's not a faulty adapter.
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
[0] https://www.audio-technica.com/en-us/ath-m50xsts [1] https://www.amazon.com/Adapter-Anker-High-Speed-Transfer-Not...
I have custom scripts, ad blocking without VPNs, and application firewalls.
I enjoy having near-full control of my device.
The what? Is this the adapter for 3.5mm headphones? If so, you don't have to get Apple-made dongles. Third parties make them also.
I'd guess the GP's actual problem is lint in the Lightning port though. Pretty common, and relatively easy to clean out too, especially compared to USB-C.
Regardless of either, they both have the same fault.
The connector between the phone and the adapter is poor. It could just be a fault with my phone but I have no way of proving this.
I suspect this is a problem with your specific phone. I've never had a problem with any Lightning accessories myself.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models, I'm becoming increasingly convinced they are going to be genuinely disruptive. No AGI, but they will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
The Air doesn't support 120Hz refresh either.
There's an app called Vivid that lets you unlock max brightness on the Pros [0], even without HDR content (no affiliation).
HDR support is most noticeable when viewing iPhone photos and videos, since iPhones shoot in HDR by default.
I may or may not have accidentally seen HDR content, but I’m not sure.
I’m looking forward to the day I notice the difference so I can appreciate what I have.
I can’t understand the people who notice the 120Hz adaptive refresh stuff; one guess is that their use is a lot twitchier than mine.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
And yes. Web apps are not really great on low-spec machines.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
A 16GB model for about a thousand bucks?? I can’t believe how far MacBooks have come in the last few years.
Where this might shift is as we start using more applications that are powered by locally running LLMs.
The MacBook Pro does seem to have some quality-of-life improvements: Thunderbolt 5, a 14-megapixel Center Stage camera (it follows you), three USB-C ports on every model, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there's now an argument against just buying the previous model.
but yes, I was looking at and anticipating the max RAM on the M4 as well as the max memory speed
128GB and 546GB/s memory bandwidth.
I like it, I don't know yet on an upgrade. But I like it. Was hoping for more RAM actually, but this is nice.
Why would people feel the need to upgrade?
And this already applies to phones. Laptop gains have been slowing for even longer.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
Getting an extra five years of longevity (after RAM became fixed) for an extra 10% was a no-brainer imho.
It is absolutely, 100%, no doubt in my mind: the hardware.
The only reason the 2009 one now gets little use is that its motherboard has developed some electronic issues; otherwise it would serve me perfectly well.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
One good use case for a 32GB Mac is being able to run 8B models at full precision, something that is not possible with 8-16GB Macs.
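For a rough sense of why, here's the back-of-the-envelope arithmetic (a sketch with illustrative numbers only; real usage adds KV cache and runtime overhead on top of the weights):

    # Approximate memory needed just to hold an 8B-parameter model's weights.
    PARAMS = 8e9

    for precision, bytes_per_param in [("fp32", 4), ("fp16/bf16", 2), ("int8", 1), ("int4", 0.5)]:
        print(f"{precision:>9}: ~{PARAMS * bytes_per_param / 1e9:.0f} GB")

    # fp16 weights alone are ~16 GB, which fits on a 32GB machine with room
    # left for the OS and KV cache; on an 8-16GB Mac you'd be forced into
    # aggressive quantization (or swapping).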
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
And it's probably good that at least one of the big players has a business model that supports driving that forward.
I feel the same about my laptop from 2011, so I guess it is partly age (not feeling the urge to always have the greatest) and partly that non-LLM, non-gaming computing is not demanding enough to force us to upgrade.
The last few years Chrome seems to have stepped up energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but it never felt bloated the way Chrome used to.
That said, they are in a very comfortable position right now, with neither Intel, AMD, nor any other competitor able to produce anything close to the bang-per-watt that Apple is managing. There's little pressure from behind to push for more performance.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
The base model is perfect. Now to decide between the M3/M4 Air and the M4 Pro.
Only downside is the screen. The brightness pretty much has to be maxed out to be readable, and viewing at the wrong angle makes even that imperfect.
That said, it’s about the same size/weight as an iPad Pro, which makes it feel much more portable than a Pro device.
But it's a heavy brick with a short battery life compared to the M1/2/3 Mac.
Has nothing whatsoever to do with CPU/memory/etc.
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Frankly though, if the Mac mini were at a slightly lower price point, I'd definitely create my own Mac mini cluster for my AI home lab.
That's me. I don't give a shit about AI, video editing, modern gaming, or Kubernetes. The newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most things introduced since Docker and VSCode have not contributed massively to how I work, and most of the things I do could be done just fine 8-10 years ago.
To me it's more like 3d printing as a niche/hobby.
A thing with <1% of all engagement in a category is niche/hobby, yes.
How old are you?
"Bro" has been gender neutral for over a decade. Males and females under the age of 25 call each other "bro" all the time.
Example?
What I do know is that Linux constantly breaks stuff. I don't even think it's treading water. These interfaces are actively getting worse.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
Quality stuff retains value, not the brand.
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resale value.
Again, the quality holds the value, not the brand.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16GB RAM) is really an amazing chip. I'm with you: outside of a repair/replacement, I'm happy to wait for 120Hz refresh, faster WiFi, and longer battery life.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
I’ve been using that exact model for about a year and I rarely find limitations for my typical office-type work. The only time I’ve managed to thermally throttle it has been with some super-suboptimal Excel macros.
I believe the rumor is that the MacBook Air will get the update to M4 in early spring 2025, February/March timeline.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
Also, the Studio and the Mac Pro are still hanging out there.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
Unfortunately Apple won’t tell you until the day they sell the machines.
Just kidding! As an Apple Shareholder I feel like you should take what Apple gives you and accept the price. ;)
Only in the US, it seems. India got a price increase of $120.
Makes me wonder what else will be updated this week (Studio or Mac Pro)?
And yes, with enough RAM, it is a surprisingly good dev laptop.
6 years of insulting their customers with DOA, useless hardware. The reality is that zero people will "not run into issues" with 8GB of RAM and a gimped 256GB SSD for caching.
No Wi-Fi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, OFDMA):
* https://en.wikipedia.org/wiki/Wi-Fi_7
* https://en.wikipedia.org/wiki/Wi-Fi_6E
The iPhone 16s do have Wi-Fi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I had just assumed that for sure this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks Wi-Fi 7. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have Wi-Fi 7 access points). I also download terabytes of data in some weeks for the work I do, and not having to plug into Ethernet at a fixed desk to do so efficiently will be a big enough win that I will wait another year before shelling out $6k "off-cycle".
Big bummer for me. I was looking forward to performance gains next Friday.
https://www.tomsguide.com/face-off/wi-fi-6e-vs-wi-fi-7-whats...
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
One of the features is preamble puncturing, which is useful in denser environments:
* https://community.fs.com/article/how-preamble-puncturing-boo...
* https://www.ruckusnetworks.com/blog/2023/wi-fi-7-and-punctur...
MLO helps with resiliency and the improved OFDMA helps with spectrum efficiency as well. It's not just about speed.
Unrelated, but "unified memory" is a strange buzzword for Apple to use. Their memory is no different from other computers'. In fact, every computer without a discrete GPU uses a unified memory model these days!
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
Or you could buy an M3 Max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/sec. However, those are peak numbers. Apple uses longer cache lines (double), larger page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's Turin.
So Apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD uses a 128-bit memory bus. You have to take a huge step up in price, power, and size with something like a Threadripper, Xeon, or Epyc to get more than 128-bit-wide memory, none of which are available in a laptop or Mac mini-sized SFF.
Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
No CPU uses a 128-bit memory channel, as it would result in overfetching data, i.e., 128B per access, or two cache lines.
AFAIK Apple uses 128B cache lines, so they can do a much better design and customization of the memory subsystem, since they do not have to use DIMMs -- they simply solder DRAM to the motherboard, hence the memory interface is whatever they want.
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point. Yes, PCs have 64-byte cache lines and Apple uses 128-byte ones. I wouldn't expect any noticeable difference because of this. Generally a cache miss is sent to a single memory channel and results in a wait of 50-100ns; then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So Apple gets twice the bytes per cache-line miss, but the value of those extra bytes is low in most cases.
The other, bigger differences are that Apple has a larger page size (16KB vs 4KB) and ARM supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMs go. Both Apple and PCs can (and do) solder DRAM chips directly to the motherboard, normally on thin/light laptops. The big difference is that Apple supports 128-, 256-, and 512-bit-wide memory on laptops and 1024-bit on the Studio (a bit bigger than most SFFs). Getting more than 128 bits on a PC means no laptops and no SFFs: generally large workstations with Xeons, Threadrippers, or Epycs, with substantial airflow and power requirements.
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
It's not really a new idea, just unusual in computers. The custom SoCs that AMD makes for PlayStation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apple's use of low-power LPDDR instead of the faster but power-hungrier GDDR used in the consoles.
And no, manaskarekar, the M4 Max does 546 GB/s not GBps (which would be 8x less!).
GB/s and GBps mean the same thing, though GB/s is the more common way to express it. Gb/s and Gbps are the units that are 8x less: bits vs Bytes.
GB/s is the same thing as GBps
The "ps" means "per second"
And yes, the impressive part is that this kind of bandwidth is hard to get on laptops. I suppose I should have been a bit more specific in my remark.
Servers do have many channels, but they run relatively slower memory.
* Specifically, it being on-die
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
https://en.wikipedia.org/wiki/DDR5_SDRAM (info from the first section):
> DDR5 is capable of 8GT/s which translates to 64 GB/s (8 gigatransfers/second * 64-bit width / 8 bits/byte = 64 GB/s) of bandwidth per DIMM.
So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
DDR4 clocks in at 3.2GT/s and the fastest DDR3 at 2.1GT/s.
DDR5 is an impressive jump. HBM is totally bonkers at 128GB/s per stack (HBM is the memory used in the top-end Nvidia datacenter cards).
Cheers.
Not quite: it depends on the number of channels, not the number of DIMMs. An extreme example: put all 16 DIMMs on a single channel, and you will get the performance of a single channel.
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
Bandwidth (GB/s) = (Data Rate (MT/s) * Channel Width (bits) * Number of Channels) / 8 / 1000
(8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
This is still half the speed of a consumer NVidia card, but the large amount of memory is great, if you don't mind running things more slowly and with fewer libraries.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
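In case anyone wants to plug in their own numbers, the formula above is easy to turn into a quick helper (a sketch; the inputs below are just the example figures from this thread, not any particular product's spec):

    def bandwidth_gb_s(mt_per_s: float, channel_bits: int, channels: int) -> float:
        """Peak bandwidth: transfers/s * bits per transfer, converted to GB/s."""
        return mt_per_s * channel_bits * channels / 8 / 1000

    # The hypothetical above: 8 channels of 64-bit memory at 8800 MT/s.
    print(bandwidth_gb_s(8800, 64, 8))   # 563.2

    # A typical 128-bit PC setup: DDR5-5600 on 4x32-bit channels.
    print(bandwidth_gb_s(5600, 32, 4))   # 89.6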
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
Most laptops will be 2 DIMMs (probably soldered).
The vast majority of x86 laptops and desktops are 128 bits wide: often 2x64-bit channels until DDR5 moved things to 4x32-bit channels in the last year or so. There are some benefits to 4 channels over 2, but generally you are still limited to 128 bits unless you buy a Xeon, Epyc, or Threadripper (or Intel equivalent) that is expensive, hot, and doesn't fit in SFFs or laptops.
So basically the PC world is far behind the 256-, 512-, and 1024-bit-wide memory buses Apple has offered since the M1 arrived.
EDIT: wtf what's so bad about this comment that it deserves being downvoted so much
I think it’s super interesting to know real life workflows and performance of different LLMs and hardware, in case you can direct me to other resources. Thanks !
Smart move by Apple
Right?
Somewhat niche case, I know.
:P
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
The M4-Max I just ordered comes with 128GB of RAM.
- Apple: all the capacity and bandwidth, but no compute to utilize it
- AMD/Nvidia: all the compute and bandwidth, but no capacity to load anything
- DDR5: all the capacity, but no compute or bandwidth (cheap tho)
I insist my 2020 MacBook M1 was the best purchase I ever made.
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily, and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how DisplayLink works. :(
Not quite sold yet, but definitely thinking about it.
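For anyone curious what "local LLM usage through Ollama" looks like in practice, it's pleasantly scriptable. A minimal sketch against Ollama's documented local REST API (the model name is just an example; substitute whatever you've pulled):

    import json
    import urllib.request

    # Ollama serves a REST API on localhost:11434 by default.
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps({
            "model": "llama3.1:8b",  # example model; use your own
            "prompt": "Why does unified memory help local LLM inference?",
            "stream": False,
        }).encode(),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])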
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade, except the increase in memory is great (I don't want to have to shut down apps to be able to load some huge LLMs), and I dinged the top case a few months ago, so now there's a shadow on the screen in that spot in some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
M1 series machines are going to be fine for years to come.
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
The battery performance is incredible too.
Also, any recommendations for suitable SSDs, ideally not too expensive? Thank you!
With a TB4 case and an NVMe drive you can get something like 2300MB/s read speeds. You can also use a USB4 case, which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
I own a media production company. We use Sabrent Thunderbolt external NVMe TLC SSDs and are very happy with their price, quality, and performance.
I suggest you avoid QLC SSDs.
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
I tried another brand or two of enclosures and they were HUGE, while the ACASIS was credit-card sized (except for thickness).
For video editing - even 8K RAW - you don't need insanely fast storage. A 10GBit/s external SSD will not slow you down.
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
(This isn't a dig on the Asahi project btw, I think it's great).
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
That's an architectural limitation of the base M4 chip, if you go up to the M4 Pro version you can get up to 48GB, and the M4 Max goes up to 128GB.
The M4 Pro goes up to 48 GB
The M4 Max can have up to 128 GB
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for ~36-48GB of memory, here are the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro can get you a lot of memory, but fewer GPU cores. Not sure how much those GPU cores factor into performance; I only really hear complaints about the memory limits... Something to consider if you're looking to buy in this range of memory.
Of course, a lot of people here probably consider it no big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't make big 6-figure salaries :-)
M4 Max 14 core has a single option of 36GB.
M4 Max 16 core lets you go up to 128GB.
So you can actually get more ram with the Pro than the base level Max.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM so as not to need swapping at all.
I've heard it's easier to just use cloud options, but I still like the idea of being able to run actual models and train them on my laptop.
I have a M1 MacBook now and I'm considering trading in to upgrade.
I've seen somewhat conflicting things regarding what you get for the money. For instance, some reports recommending a M2 Pro for the money IIRC.
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
I hope that the response times have improved, because it has been quite poor for a 120 Hz panel.
The so-called "antiglare" option wasn't true matte. You'd really have to go back to 2008.
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
(I tend to feel if you want something specialized, you gotta pay for the expensive model)
I might still keep it another year or so, which is a testament to how good it is and how relatively little progress has happened in almost 10 years.
It’s essentially a matte coating, but the execution on iPad displays is excellent. While it doesn’t match the e-ink experience of devices like the Kindle or ReMarkable, it’s about 20-30% easier on the eyes. The texture also feels great (even though that's less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display XDR or an iPad as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
I am probably not the best example to emulate lol.
It's easier to read on it.
This is nice, and long overdue.
Great to see Affinity becoming so popular that it gets acknowledged by Apple.
It sounds more exciting than M4 is 12.5% faster than M3.
I've just ordered an (almost) top-of-the-range MBP Max, my current machine is an MBP M1-max, so the comparisons are pretty much spot-on for me.
Selling the M1 Ultra Studio to help pay for the M4 MBP Max, I don't think I need the Studio any more, with the M4 being so much faster.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case, the machine's performance is not my productivity bottleneck. HN, on the other hand... that one needs to be attenuated. :)
Back when Moore's law was still working they didn't skip generations like this.
M4 is built with TSMC's 2nd Gen 3nm process. M3 is on the 1st gen 3nm.
For the base M3 vs base M4:
- the CPU (4P+4E) & GPU (8) core counts are the same
- NPU perf is slightly better for the M4, I think (M4's 38 TOPS @ INT8 vs M3's 18 TOPS @ INT16)
- Memory Bandwidth is higher for M4 (120 GB/s vs 102.4 GB/s)
- M4 has a higher TDP (22W vs 20W)
- M4 has higher transistor count (28B vs 25B)
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly when it comes to generating heat, if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time then basically it points to stupid software -- or let's say poorly balanced (likely aimed purely at benchmark numbers than user experience).
https://nanoreview.net/en/cpu-compare/apple-m1-vs-amd-ryzen-...
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
I genuinely want to use it as a primary machine, but with this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the need to use the App Store, Apple always asking me to have an Apple username and password (which I don't have and have zero intention of creating), having to block Siri and all the telemetry stuff Apple has baked in, stopping the machine from calling home, etc.
This is to mirror the tools available on Windows for disabling and removing Microsoft's built-in bloatware and ad tracking.
Pretty much all the software I use is from brew.
If that's your question, yes - various options exist like https://asahilinux.org
Then Windows 11 came out.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
Get a PC.
That, combined with the iCloud and telemetry BS, meant I'd had enough.
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
In January, after researching, I bought an apple restored MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many, even Apple's Logic Pro don't really make use of efficiency cores. I'm curious about what restraints have led to this.. but perhaps this also factors into Apple's choice to increase the ratio of performance/efficiency cores.
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
Curious how you're measuring this. Can you see it in Activity Monitor?
> Maybe that’s the downside of having a better display.
Yes I think so. Display is a huge fraction of power consumption in typical light (browsing/word processing/email) desktop workloads.
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
Yes, the processor history in Activity Monitor marks out specific cores as Performance and Efficiency.
Example: https://i.redd.it/f87yv7eoqyh91.jpg
Is an upgrade really worth it?
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
https://softwareengineeringdaily.com/2024/10/15/linux-apple-...
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
It also avoids the trouble of using a hosted LLM that decides to double their price overnight, costs are very predictable.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
Before the M4 models: omg, Apple only gives you 8GB RAM in the base model? Garbage!
After the M4 models: the previous laptops were so good, why would you upgrade?
The only downside is that I see a kind of "burnt?" transparent spot on my screen. Also, when connecting an HDMI cable, the sound does not output properly to the TV, and it makes the video I play laggy. Wondering if going to the Apple Store would fix it?
Personal anecdote: don't get your hopes up. I've had my issues rejected as 'no fault found', but it's definitely worth spending a bit of time on.
On a side note, anyone know what database software was shown during the announcement?
https://www.datensen.com/data-modeling/luna-modeler-for-rela...
At long last, I can safely recommend the base model macbook air to my friends and family again. At $1000 ($900 with edu pricing on the m2 model) it really is an amazing package overall.
MBP: Apple M4 Max chip with 16‑core CPU, 40‑core GPU and 16‑core Neural Engine
Mac mini: Apple M4 Pro chip with 14‑core CPU, 20‑core GPU, 16-core Neural Engine
What kind of workload would make me regret not having bought the MBP over the Mac mini, given the above? Thanks!
- photo/video editing
- games, or
- AI (training / inference)
that would benefit from the extra GPUs.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
- Desktop with lots of cores: Mac Studio
- Desktop for single core: Mac mini
I really enjoy my MacBook Air M3 24GB for desktop + mobile use for webdev: https://news.ycombinator.com/item?id=41988340
The base model doesn't support thunderbolt 5.
And the base model still doesn't support more than 2 external displays without DisplayLink (not DisplayPort!) hardware+software.
"M4 and M4 Pro
Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI"
"The display engine of the M4 family is enhanced to support two external displays in addition to a built-in display."
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
The M3 Pro has 192-bit-wide memory; GPU improvements mostly offset the decrease in memory bandwidth. This leads to memory options like 96GB.
The M4 Pro has 256-bit-wide memory, thus the factor-of-2 memory options.
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
I'm sure the specs are great.
I look at my local source vs the recording, and I am baffled.
After a decade of online meeting software, we still stream 480p quality it seems.
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs streaming will become better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial movement data is transmitted, similar to how game engines work now.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
:/
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
Looking at how long the 8GB lasted, it's a pretty sure bet that now you won't need to upgrade for a good few years.
I mean, I have a MacBook Air with 16GB of RAM and it's honestly working pretty well to this day. I don't do "much" on it though, but not many people do.
I'd say the one incentive the MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
> I mean, I have a MacBook Air with 16GB of RAM and it's honestly working pretty well to this day. I don't do "much" on it though, but not many people do.
If an HN user can get along with 16GB on their MacBook Air for the last X years, most users were able to get by with 8GB.
They are supposed to be "green" but they encourage obsolescence.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is ok with 8GB, even now. Those devices are being retired as part of the normal lifecycle with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
Or you could get a Framework, where you can actually upgrade the parts that are worth upgrading, instead of "upgrading" by buying a whole new machine.
It's fine, but the issue is Linux sleep/hibernate battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a bit because the battery dies. I have to shut it down (not just close the lid) before flying, or my backpack becomes a heater and the laptop dies. To use a MacBook that's been closed for months, I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work, I have a desktop for that already. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
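For what it's worth, the drain-while-closed problem is sometimes fixable without much tinkering via systemd's suspend-then-hibernate, which suspends normally and then hibernates to disk after a delay, so a laptop closed for days wakes with a full battery. A sketch of the relevant config; exact support varies by distro and kernel, and it needs a swap area big enough to hibernate into:

    # /etc/systemd/sleep.conf -- hibernate after an hour of suspend
    [Sleep]
    HibernateDelaySec=60min

    # /etc/systemd/logind.conf -- make closing the lid use it
    [Login]
    HandleLidSwitch=suspend-then-hibernate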
I don't really see a world where this machine doesn't last me a few more years. If there's anything I'd service, it would be the battery, but eh. It lasts more than a few hours and I don't go out much.
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10GB of RAM being used - I have nothing open.
Does this mean the MacBook Air 8GB model I had 10 years ago would basically be unable to just run the operating system alone?
It's disconcerting. Ozempic for terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
It tells me my computer is using 8GB of RAM after a restart, before I've begun to open or close anything.
When I open Xcode and tell it to run a macOS app I have made no changes to - I just timed it - it takes 30 seconds.
I take a print statement and add ONE CHARACTER to the print statement and press Run - it takes 30 seconds again.
No amount of hand waving can excuse this sort of nonsense in my mind.
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips" so it seems like the old version of the page still?
This is misleading:
https://news.ycombinator.com/item?id=25074959
"macOS sends hashes of every opened executable to some server of theirs"
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
No space grey?!
I don’t think it’s wise though. I bought a base M1 Pro MBP when it launched and don’t feel a need to upgrade at all yet. I’m holding off for a few more years, to grab whatever brings the next major increase in local LLM capability and battery life.
If it affects your earning power to that extent, you should probably pony up and save in the long run, probably just a few years until you see returns.
Caste system usually can't be bypassed by paying a monthly subscription fee.
I will note that making it a subscription will tend to increase the overall costs, not decrease. In an environment with ready access to credit, I think offering on a subscription basis is worse for consumers?
If it matters that much to you just sell the old one and buy the new. That's your subscription.
Give us data, tell us what's new, and skip the nonsense buzz-filling adjectives.
To quote Russell Brand: just say he sat down, not that he placed his luscious ass in silk-covered trousers on a velvety smooth chair, experiencing pleasure as the strained thigh muscles received respite after a gruelling day on their feet, watching a lush sunset in a cool summer evening's breeze.