You can argue about whether it's actually bulletproof or not, but the fact is, nobody else is even trying; everyone else has lost sight of privacy-focused features in their rush to ship anything and everything on my device to OpenAI or Gemini.
I am thrilled to shell out thousands and thousands of dollars to purchase a machine that feels like it really belongs to me, from a company that respects my data and has aligned incentives.
I happen to be in the midst of a repair with Apple right now. And for me, the idea that they might replace my aging phone with a newer unit is a big plus, as I think it would be for almost everyone. Aside from the occasional sticker, I don't have any custom hardware mods on my phone or laptop, and neither do 99.99% of people.
Can Apple please every single tech nerd 100% of the time? No. Those people should stick to Linux, so that they can have a terrible usability experience ALL the time, but feel more "in control," or something.
Did I say it would be a "new one"?
Because they got sued for doing that once, and people including myself are in line to get checks from it.
Genuinely asking: are there any specifics on this? I understand that blocking at the firewall level is an option, but I recall someone here mentioning an issue where certain local machine rules don’t work effectively. I believe this is the issue [1]. Has it been “fixed”?
[1] https://appleinsider.com/articles/21/01/14/apple-drops-exclu...
Everything is a tradeoff.
I’d love to live in the F-Droid alt-tech land, but everything really comes down to utility. Messaging my friends is more important than using the right IM protocol.
Much as I wish I could convince everyone I know and have yet to meet to message me on Signal or whatever, that simply isn’t possible. Try explaining that I am not on Whatsapp or insta to a girl I’ve just met…
Also it is nice to spend basically no time maintaining the device, and have everything work together coherently. Time is ever more valuable past a certain point.
I'm curious: what hardware and software stack do you use?
How true is this when their devices are increasingly hostile to user repair and upgrades? macOS also tightens the screws on what you can run and from where, or at least requires more hoop-jumping over time.
If you allowed third-party components without restraint, there'd be no way to prevent someone swapping out a component for a compromised one.
Lock-in and planned obsolescence are also factors, and ones I'm glad the EU (and others) are pushing back on. But it isn't as if there are no legitimate tradeoffs.
Regarding screw tightening... if they ever completely remove the ability to run untrusted code, yes, then I'll admit I was wrong. But I am more than happy to have devices be locked down by default. My life has gotten much easier since I got my elderly parents and non-technical siblings to move completely to the Apple ecosystem. That's the tradeoff here.
It works via your keychain and your contacts, and the recipient gets a little notification to allow you to view their screen.
That’s it - no downloads, no login, no 20 minutes getting a Remote Desktop screen share set up.
Yeah, but this is hacker news.
Not sure what you mean exactly by this, but to me their Self Service Repair program is a step in the right direction.
They could go out of their way to make things actually easy to work on and service, but that has never been the Apple Way. Compare to Framework, or building your own PC, or even repairing a laptop from another OEM.
I can neither repair nor upgrade my electric car, my furniture, or my plumbing. But they all still belong to me.
edit: also, unless you are the digital equivalent of "off the grid", I would argue most people are going to need some sort of cloud-based identity anyway for messaging, file-sharing, etc. iCloud is far and away the most secure of the options available to most users, and the only one that uses full end-to-end encryption across all services.
"You need some cloud-based identity, and this is the best one," even granting its premises, doesn't make being forced into this one a good thing. I'm an Apple user, but there are plenty of people I need to message and share files with who aren't in the Apple ecosystem.
But you're not forced. You completely ignored the other response in order to continue grinding an axe.
Can you explain what you mean by this? I have been doing software development on MacOS for the last couple of years and have found it incredibly easy to run anything I want on my computer from the terminal, whenever I want. Maybe I'm not the average user, but I use mostly open-source Unix tooling and have never had a problem with permissions or restrictions.
Are you talking about packaged applications that are made available on the App Store? If so, sure, have rules to make sure the store is high-quality, kinda like how Costco doesn't let anyone just put garbage on their shelves.
Amongst all the big tech companies, Apple is the closest you will get if you want privacy.
That's just security theater. As long as nobody can look inside their ICs, nobody knows what's really happening there.
> Apple oversells its differential privacy protections. "Apple’s privacy loss parameters exceed the levels typically considered acceptable by the differential privacy research community," says USC professor Aleksandra Korolova, a former Google research scientist who worked on Google's own implementation of differential privacy until 2014. She says the dialing down of Apple's privacy protections in iOS in particular represents an "immense increase in risk" compared to the uses most researchers in the field would recommend.
https://www.wired.com/story/apple-differential-privacy-short...
Assuming they go through with that, this alone puts them leagues ahead of any other cloud service.
It also means that to mine your data the way everyone else does, they would need to deliberately insert _hardware_ backdoors into their own systems, which seems a bit too difficult to keep secret and a bit too damning a scandal should it be discovered...
Occam's razor here is that they're genuinely trying to use real security as a competitive differentiator.
Chip | Geekbench Score (Process)
---- | ------------------------
M1 | 2,419 (5nm)
M2 | 2,658 (5nm)
M3 | 3,076 (3nm)
M4* | 3,810 (3nm)
In my experience, single-core CPU is the best all-around indicator of how "fast" a machine feels. I feel like Apple kind of buried this in their press release.
M4 benchmark source: https://browser.geekbench.com/v6/cpu/8171874
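For what it's worth, the generation-over-generation gains from the table work out roughly like this (a quick Python sketch, treating those scores as ballpark Geekbench 6 single-core numbers):

    # Relative single-core improvement between generations, from the table above.
    scores = {"M1": 2419, "M2": 2658, "M3": 3076, "M4": 3810}
    names = list(scores)
    for prev, cur in zip(names, names[1:]):
        print(f"{prev} -> {cur}: +{scores[cur] / scores[prev] - 1:.0%}")
    print(f"M1 -> M4 overall: +{scores['M4'] / scores['M1'] - 1:.0%}")
    # M1->M2: +10%, M2->M3: +16%, M3->M4: +24%, M1->M4: +58%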
Flameproof suit donned. Please correct me because I'm pretty ignorant about modern hardware. My main interest is playing lots of tracks live in Logic Pro.
It kind of just comes off as one of those YouTube liminal space horror videos when it's that empty.
Think about the early iPod ads, just individuals dancing to music by themselves. https://www.youtube.com/watch?v=_dSgBsCVpqo
You can even go back to 1983 "Two kinds of people": a solitary man walks into an empty office, works by himself on the computer and then goes home for breakfast. https://youtu.be/4xmMYeFmc2Q
It's weirdly dystopian. I didn't realize it bothered me until moments before my comment, but now I can't get it out of my head.
I would think that a brand that is at least trying to put some emphasis on privacy in their products would also extend the same principle to their workforce. I don’t work for Apple, but I doubt that most of their employees would be thrilled about just being filmed at work for a public promo video.
> liminal space horror
reminds me of that god awful crush commercial
This was a reminder to me that art is subjective. I don’t get the outrage. I kinda like it.
> M4 Max supports up to 128GB of fast unified memory and up to 546GB/s of memory bandwidth, which is 4x the bandwidth of the latest AI PC chip. This allows developers to easily interact with large language models that have nearly 200 billion parameters.
Having more memory bandwidth doesn't directly help with running larger LLMs; capacity is the constraint. A 200B-parameter model still requires about 200GB of RAM even quantized down to "q8", and these laptops don't have 200GB of RAM.
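As a rough sketch of the capacity math (the bytes-per-parameter figures are the usual approximations; real usage also needs headroom for KV cache and activations):

    # Rough estimate of the RAM needed just to hold an LLM's weights.
    # Approximate bytes per parameter: fp16 ~= 2, q8 ~= 1, q4 ~= 0.5.
    def weights_gb(params_billion: float, bytes_per_param: float) -> float:
        return params_billion * bytes_per_param  # (params_billion * 1e9) * bytes / 1e9

    for quant, bpp in [("fp16", 2.0), ("q8", 1.0), ("q4", 0.5)]:
        print(f"200B params @ {quant}: ~{weights_gb(200, bpp):.0f} GB")

    # 200B @ q8 -> ~200 GB: does not fit in 128 GB of unified memory.
    # 200B @ q4 -> ~100 GB: fits, with headroom for KV cache and the OS.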
I insist my 2020 MacBook M1 was the best purchase I ever made.
I've never kept any laptop as long as I've kept the M1. I was more or less upgrading yearly in the past because the speed increases (both in the G4 and then Intel generations) were so significant. This M1 has exceeded my expectations in every category; it's faster, quieter, and cooler than any laptop I've ever owned.
I've had this laptop since release in 2020 and I have nearly 0 complaints with it.
I wouldn't upgrade except that the increase in memory is great (I don't want to have to shut down apps to be able to load some huge LLMs), and I dinged the top case a few months ago, and now there's a shadow on the screen in that spot in some lighting conditions, which is very annoying.
I hope (and expect) the M4 to last just as long as my M1 did.
My 2015 MBP would like to have a word.
It’s the only laptop purchase I’ve made. I still use it to this day, though not as regularly.
I will likely get a new MBP one of these days.
When you upgrade, prepare to be astonished.
The performance improvement is difficult to convey. It's akin to traveling by horse and buggy. And then hopping into a modern jetliner, flying first class.
It's not just speed. Display quality, build quality, sound quality, keyboard quality, trackpad, ports, etc., have all improved considerably.
Last one with upgrade capabilities; now it has two fast SSDs and maximum RAM. I changed the battery once.
Only shame is that it doesn’t get major macOS upgrades anymore.
Still good enough to browse the web, do office productivity and web development.
12 years of good use, I am not sure I can get so much value anywhere now
I also have a M1 from work that is absolutely wonderful, but I think it's time for me to upgrade the 2015 with one of these new M4s.
The longevity of Macbooks is insanely good.
Rebuilding a bunch of Docker images on an older Intel Mac is quite the slow experience if you're doing it multiple times per day.
I have either assembled my own desktop computers or purchased ex-corporate Lenovo machines over the years, with a mix of Windows (for gaming, obviously) and Linux. Only recently (4 years ago) was I given an MBP by work, as IT cannot manage Linux machines like they do macOS and Windows.
I have moved from an Intel i5 MBP to an M3 Pro (?) and it makes me want to throw away the dependable ThinkPad/Fedora machine I still use for personal projects.
Could you get more money by selling it? Sure. But it's hard to beat the convenience. They ship you a box. You seal up the old device and drop it off at UPS.
I also build my desktop computers with a mix of Windows and Linux. But those are upgraded over the years, not regularly.
What different lives we live. This first M1 was in November 2020. Not even four years old. I’ve never had a [personal] computer for _less_ time than that. (Work, yes, due to changing jobs or company-dictated changes/upgrades)
Me too. Only one complaint. After I accidentally spilled a cup of water into it on an airplane, it didn't work.
(However AppleCare fixed it for $300 and I had a very recent backup. :) )
What’s more annoying is that I’d just get a new one and recycle this one, but the SSD is soldered on. Good on you for having a backup.
Do not own a Mac unless you bought it used or have AppleCare.
And yeah, this incident reminded me of why it's important to back up as close to daily as you can, or even more often during periods when you're doing important work and want to be sure you have the intermediate steps.
- More RAM, primarily for local LLM usage through Ollama (a bit more overhead for bigger models would be nice)
- A bit niche, but I often run multiple external displays. DisplayLink works fine for this, but I also use live captions heavily and Apple's live captions don't work when any form of screen sharing/recording is enabled... which is how DisplayLink works. :(
Not quite sold yet, but definitely thinking about it.
(of course, everyone else has a macbook too, there's always someone that can lend me a charger. Bonus points that the newer macbooks support both magsafe and USB-C charging. Added bonus points that they brought back magsafe and HDMI ports)
This is the best machine I have ever owned. It is so completely perfect in every way. I can't imagine replacing it for many many years.
M1 series machines are going to be fine for years to come.
Actually, wasn't the M1 itself an evolution / upscale of their A-series CPUs, which they've now been working on since... before 2010? The iPhone 4 was the first one with their own CPU, although the design was from Samsung + Intrinsity; it was only the A6 that they claimed was custom-designed by Apple.
Bought a reasonably well-specced Intel Air for $1700ish. The M1s came out a few months later. I briefly thought about the implication of taking a hit on my "investment", figured I might as well cry once rather than suffer endlessly. Sold my $1700 Intel Air for $1200ish on craigslist (if I recall correctly), picked up an M1 Air for about that same $1200 pricepoint, and I'm typing this on that machine now.
That money was lost as soon as I made the wrong decision, I'm glad I just recognized the loss up front rather than stewing about it.
That said...scummy move by Apple. They tend to be a little more thoughtful in their refresh schedule, so I was caught off guard.
The battery performance is incredible too.
If you need to do some work offline, or for some reason the place you work blocks access to cloud providers, it's not a bad way to go, really. Note that if you're on battery, heavy LLM use can kill your battery in an hour.
Unrelated but unified memory is a strange buzzword being used by Apple. Their memory is no different than other computers. In fact, every computer without a discrete GPU uses a unified memory model these days!
On PC desktops I always recommend getting a mid-range tower server precisely for that reason. My oldest one is about 8 years old and only now it's showing signs of age (as in not being faster than the average laptop).
Or you could buy an M3 Max laptop for $4k, get 10+ hour battery life, have it fit in a thin/light laptop, and still get 546GB/s. However those are peak numbers. Apple uses longer cache lines (double), larger page sizes (quadruple), and a looser memory model. Generally I'd expect nearly every memory bandwidth measure to win on Apple over AMD's Turin.
So Apple manages decent GPU performance, a tiny package, and great battery life. It's much harder on the PC side because every laptop/desktop chip from Intel and AMD uses a 128-bit memory bus. You have to take a huge step up in price, power, and size with something like a Threadripper, Xeon, or Epyc to get more than 128-bit-wide memory, none of which are available in a laptop or Mac mini-sized SFF.
It's not really a new idea, just unusual in computers. The custom SoCs that AMD makes for PlayStation and Xbox have wide (up to 384-bit) unified memory buses, very similar to what Apple is doing, with the main distinction being Apple's use of low-power LPDDR instead of the faster but more power-hungry GDDR used in the consoles.
Memory interface width of modern CPUs is 64-bit (DDR4) and 32+32 (DDR5).
No CPU uses 128b memory bus as it results in overfetch of data, i.e., 128B per access, or two cache lines.
AFAIK Apple uses 128B cache lines, so they can do much better design and customization of memory subsystem as they do not have to use DIMMs -- they simply solder DRAM to the motherboard, hence memory interface is whatever they want.
Sure, per channel. PCs have 2x64 bit or 4x32 bit memory channels.
Not sure I get your point; yes, PCs have 64-byte cache lines and Apple uses 128-byte. I wouldn't expect any noticeable difference because of this. Generally a cache miss is sent to a single memory channel and results in a wait of 50-100ns, then you get 4 or 8 bytes per cycle at whatever memory clock speed you have. So Apple gets twice the bytes per cache-line miss, but the value of those extra bytes is low in most cases.
Another, bigger difference is that Apple has a larger page size (16KB vs 4KB) and ARM supports a looser memory model, which makes it easier to reach a large fraction of peak memory bandwidth.
However, I don't see any relationship between Apple and PCs as far as DIMMs. Both Apple and PCs can (and do) solder DRAM chips directly to the motherboard, normally on thin/light laptops. The big difference between Apple and PC is that Apple supports 128-, 256-, and 512-bit-wide memory on laptops and 1024-bit on the Studio (a bit bigger than most SFFs). To get more than 128 bits with a PC means no laptops and no SFFs, generally large workstations with Xeons, Threadrippers, or Epycs with substantial airflow and power requirements.
Also important to consider that the RTX 4090 has a relatively tiny 384-bit memory bus. Smaller than the M1 Max's 512-bit bus. But the RTX 4090 has 1 TB/s bandwidth and significantly more compute power available to make use of that bandwidth.
The M4 Max is definitely not a 4090 killer; it does not match it in any way. It can however work on larger models than the 4090 and have a battery that can last all day.
My memory is a bit fuzzy, but I believe the M3 Max did decently in some games compared to the laptop Nvidia 4070 (which is not the same as the desktop 4070). But it highly depended on whether the game was x86-64 (requiring emulation) and whether it was DX11 or Apple-native. I believe Apple claims improvements in Metal (Apple's GPU library) and that the M4 GPUs have better FP for ray tracing, but no significant changes in rasterized performance.
I look forward to the 3rd party benchmarks for LLM and gaming on the m4 max.
Many integrated graphics segregate the memory into CPU owned and GPU owned, so that even if data is on the same DIMM, a copy still needs to be performed for one side to use what the other side already has.
This means that the drivers, etc, all have to understand the unified memory model, etc. it’s not just hardware sharing DIMMs.
APUs with shared everything are not a new concept, they are actually older than programmable graphics coprocessors…
https://www.heise.de/news/Gamescom-Playstation-4-bietet-Unif...
And yes, the impressive part is that this kind of bandwidth is hard to get on laptops. I suppose I should have been a bit more specific in my remark.
Servers do have many channels but they run relatively slower memory
* Specifically, it being on-die
Also, DRAM is never on-die. On-package, yes, for Apple's SoCs and various other products throughout the industry, but DRAM manufacturing happens in entirely different fabs than those used for logic chips.
And no, manaskarekar, the M4 Max does 546 GB/s not GBps (which would be 8x less!).
GB/s and GBps mean the same thing, though GB/s is the more common way to express it. Gb/s and Gbps are the units that are 8x less: bits vs Bytes.
GB/s is the same thing as GBps
The "ps" means "per second"
https://en.wikipedia.org/wiki/DDR5_SDRAM (info from the first section):
> DDR5 is capable of 8GT/s which translates to 64 GB/s (8 gigatransfers/second * 64-bit width / 8 bits/byte = 64 GB/s) of bandwidth per DIMM.
So for example if you have a server with 16 DDR5 DIMMs (sticks) it equates to 1,024 GB/s of total bandwidth.
DDR4 clocks in at 3.2GT/s and the fastest DDR3 at 2.1GT/s.
DDR5 is an impressive jump. HBM is totally bonkers at 128GB/s per DIMM (HBM is the memory used in the top end Nvidia datacenter cards).
Cheers.
Not quite as it depends on number of channels and not on the number of DIMMs. An extreme example: put all 16 DIMMs on single channel, you will get performance of a single channel.
An RTX4090 or H100 has memory extremely close to the processor but I don't think you would call it unified memory.
A huge part of optimizing code for discrete GPUs is making sure that data is streamed into GPU memory before the GPU actually needs it, because pushing or pulling data over PCIe on-demand decimates performance.
Bandwidth (GB/s) = (Data Rate (MT/s) * Channel Width (bits) * Number of Channels) / 8 / 1000
(8800 MT/s * 64 bits * 8 channels) / 8 / 1000 = 563.2 GB/s
This is still half the speed of a consumer NVidia card, but the large amounts of memory is great, if you don't mind running things more slowly and with fewer libraries.
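A quick sketch of that formula in Python; the 8800 MT/s rate and the 8x64-bit channel split are my assumptions, not confirmed specs:

    # Peak bandwidth = data rate (MT/s) * channel width (bits) * channels / 8 bits per byte.
    def bandwidth_gb_s(data_rate_mt_s: float, width_bits: int, channels: int) -> float:
        return data_rate_mt_s * width_bits * channels / 8 / 1000

    # Assumed M4 Max-like layout: 8800 MT/s across 8 x 64-bit channels.
    print(bandwidth_gb_s(8800, 64, 8))   # 563.2 GB/s
    # Typical DDR5-5600 desktop: 2 x 64-bit channels.
    print(bandwidth_gb_s(5600, 64, 2))   # 89.6 GB/s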
Was this example intended to describe any particular device? Because I'm not aware of anything that operates at 8800 MT/s, especially not with 64-bit channels.
I believe I saw somewhere that the actual chips used are LPDDR5X-8533.
Effectively the parent's formula describes the M4 Max, give or take 5%.
But it has more than 2x longer battery life and a better keyboard than a GPU card ;)
Most laptops will be 2 DIMMS (probably soldered).
The vast majority of x86 laptops or desktops are 128 bits wide: often 2x64-bit channels until a year or so ago, now 4x32-bit with DDR5. There are some benefits to 4 channels over 2, but generally you are still limited to 128 bits unless you buy a Xeon, Epyc, or Threadripper (or Intel equivalent), which are expensive, hot, and don't fit in SFFs or laptops.
So basically the PC world is crazy behind the 256-, 512-, and 1024-bit-wide memory buses Apple has offered since the M1 arrived.
EDIT: wtf what's so bad about this comment that it deserves being downvoted so much
> Intel processor graphics architecture has long pioneered sharing DRAM physical memory with the CPU. This unified memory architecture offers [...]
It more or less seems like they use "unified memory" and "shared memory" interchangeably in that section.
Calling something "shared" makes you think: "there's not enough of it, so it has to be shared".
Calling something "unified" makes you think: "they are good engineers, they managed to unify two previously separate things, for my benefit".
I think it’s super interesting to know real life workflows and performance of different LLMs and hardware, in case you can direct me to other resources. Thanks !
An M2 is, according to a Reddit post, around 27 TFLOPS.
So less than 1/10 the performance in computation alone, let alone the memory.
What workflow would use something like this?
Memory and memory bandwidth matter most for inference. 819.2 GB/s for the M2 Ultra is less than half that of an A100, but having 192GB of RAM instead of 80GB means they can run inference on models that would require THREE of those A100s, and the only real cost is that it takes longer for the AI to respond.
3 A100 at $5300/mo each for the past 2 years is over $380,000. Considering it worked for them, I'd consider it a massive success.
From another perspective though, they could have bought 72 of those Ultra machines for that much money and had most devs on their own private instance.
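Rough math, taking the $5,300/month A100 rate and a roughly $5,300 Mac Studio price from the comments above as given:

    # Cloud A100 rental vs. buying Mac Studios, using the figures quoted above.
    a100_per_month = 5_300             # assumed monthly rate per A100
    cloud_cost = 3 * a100_per_month * 24
    print(cloud_cost)                  # 381600 -> "over $380,000" for 3 cards over 2 years

    studio_price = 5_300               # assumed price of one M2 Ultra Mac Studio
    print(cloud_cost // studio_price)  # 72 -> "72 of those Ultra machines"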
The simple fact is that Nvidia GPUs are massively overpriced. Nvidia should worry a LOT that Apple's private AI cloud is going to eat their lunch.
Smart move by Apple
Right?
The M4-Max I just ordered comes with 128GB of RAM.
Somewhat niche case, I know.
:P
I wonder if that has changed or is about to change as Apple pivots their devices to better serve AI workflows as well.
- Apple: all the capacity and bandwidth, but no compute to utilize it
- AMD/Nvidia: all the compute and bandwidth, but no capacity to load anything
- DDR5: all the capacity, but no compute or bandwidth (cheap tho)
> MacBook Pro with M4 Pro is up to 3x faster than M1 Pro (13)
> (13) Testing conducted by Apple from August to October 2024 using preproduction 16-inch MacBook Pro systems with Apple M4 Pro, 14-core CPU, 20-core GPU, 48GB of RAM and 4TB SSD, and production 16-inch MacBook Pro systems with Apple M1 Pro, 10-core CPU, 16-core GPU, 32GB of RAM and 8TB SSD. Prerelease Redshift v2025.0.0 tested using a 29.2MB scene utilising hardware-accelerated ray tracing on systems with M4 Pro. Performance tests are conducted using specific computer systems and reflect the approximate performance of MacBook Pro.
So they're comparing software that uses raytracing present in the M3 and M4, but not in the M1. This is really misleading. The true performance increase for most workloads is likely to be around 15% over the M3. We'll have to wait for benchmarks from other websites to get a true picture of the differences.
Edit: If you click on "go deeper on M4 chips", you'll get some comparisons that are less inflated, for example, code compilation on Pro:
14-inch MacBook Pro with M4 4.5x
14-inch MacBook Pro with M3 3.8x
13-inch MacBook Pro with M1 2.7x
So here the M4 Pro is 67% faster than the M1 Pro, and 18% faster than the M3 Pro. It varies by workload of course.
No benchmarks yet, but this article gives some tables of comparative core counts, max RAM and RAM bandwidths: https://arstechnica.com/apple/2024/10/apples-m4-m4-pro-and-m...
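Since those multipliers share the same baseline, the relative speedups fall straight out of the ratios, e.g.:

    # Apple's compile-time multipliers are all relative to the same baseline system.
    m4_pro, m3_pro, m1_pro = 4.5, 3.8, 2.7
    print(f"M4 Pro vs M1 Pro: +{m4_pro / m1_pro - 1:.0%}")  # +67%
    print(f"M4 Pro vs M3 Pro: +{m4_pro / m3_pro - 1:.0%}")  # +18%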
Also, any recommendations for suitable ssds, ideally not too expensive? Thank you!
Here is the rabbit hole you might want to check out: https://dancharblog.wordpress.com/2024/01/01/list-of-ssd-enc...
With a TB4 case with an NVMe you can get something like 2300MB/s read speeds. You can also use a USB4 case which will give you over 3000MB/s (this is what I'm doing for storing video footage for Resolve).
With a TB5 case you can go to like 6000MB/s. See this SSD by OWC:
I have 2-4TB drives from Samsung, WD and Kingston. All work fine and are ridiculously fast. My favourite enclosure is from DockCase for the diagnostic screen.
I own a media production company. We use Sabrent Thunderbolt external NVMe TLC SSDs and are very happy with their price, quality, and performance.
I suggest you avoid QLC SSDs.
My only complaint is that Apple gouges you for memory and storage upgrades. (But in reality I don't want the raw and rendered video taking up space on my machine).
Switched to Samsung T9s, so far so good.
I tried another brand or two of enclosures and they were HUGE, while the Acasis was credit-card sized (except thickness).
For video editing - even 8K RAW - you don't need insanely fast storage. A 10GBit/s external SSD will not slow you down.
No Wi-Fi 7. So you get access to the 6 GHz band, but not some of the other features (preamble puncturing, OFDMA):
* https://en.wikipedia.org/wiki/Wi-Fi_7
* https://en.wikipedia.org/wiki/Wi-Fi_6E
The iPhone 16s do have Wi-Fi 7. Curious to know why they skipped it (and I wonder if the chipsets perhaps do support it, but it's a firmware/software-not-yet-ready thing).
I had just assumed that this would be the year I upgrade my M1 Max MBP to an M4 Max. I will not be doing so knowing that it lacks Wi-Fi 7. As one of the child comments notes, I count on getting a solid 3 years out of my machine, so future-proofing carries some value (and I already have Wi-Fi 7 access points). I download terabytes of data in some weeks for the work I do, and not having to plug into Ethernet at a fixed desk to do so efficiently will be a big enough win that I will wait another year before shelling out $6k “off-cycle”.
Big bummer for me. I was looking forward to performance gains next Friday.
https://www.tomsguide.com/face-off/wi-fi-6e-vs-wi-fi-7-whats...
Laptops/desktops (with 16GB+ of memory) could make use of the faster speed/more bandwidth aspects of WiFi7 better than smartphones (with 8GB of memory).
Machines can last and be used for years, and it would be a presumably very simple way to 'future proof' things.
And though the IEEE spec hasn't officially been ratified as I type this, it is set to be by the end of 2024. Network vendors are also shipping APs with the functionality, so in coming years we'll see a larger and larger infrastructure footprint going forward.
One of the features is preamble puncturing, which is useful in denser environments:
* https://community.fs.com/article/how-preamble-puncturing-boo...
* https://www.ruckusnetworks.com/blog/2023/wi-fi-7-and-punctur...
MLO helps with resiliency and the improved OFDMA helps with spectrum efficiency as well. It's not just about speed.
Wish I could spin up a Linux OS on the hardware though. Not a bright spot for me.
It won't have all the niceties / hardware support of MacOS, but it seamlessly coexists with MacOS, can handle the GPU/CPU/RAM with no issues, and can provide you a good GNU/Linux environment.
I guess you could have a physical MBP in your house and connect it to some bring-your-own-infrastructure CI setup, but most people wouldn't want to do that.
(This isn't a dig on the Asahi project btw, I think it's great).
> ...the new MacBook Pro starts with 16GB of faster unified memory with support for up to 32GB, along with 120GB/s of memory bandwidth...
I haven't been an Apple user since 2012 when I graduated from college and retired my first computer, a mid-2007 Core2 Duo Macbook Pro, which I'd upgraded with a 2.5" SSD and 6GB of RAM with DDR2 SODIMMs. I switched to Dell Precision and Lenovo P-series workstations with user-upgradeable storage and memory... but I've got 64GB of RAM in the old 2019 Thinkpad P53 I'm using right now. A unified memory space is neat, but is it worth sacrificing that much space? I typically have a VM or two running, and in the host OS and VMs, today's software is hungry for RAM and it's typically cheap and upgradeable outside of the Apple ecosystem.
That's an architectural limitation of the base M4 chip, if you go up to the M4 Pro version you can get up to 48GB, and the M4 Max goes up to 128GB.
The M4 Pro goes up to 48 GB
The M4 Max can have up to 128 GB
The M4 Pro with 14‑core CPU & 20‑core GPU can do 48GB.
If you're looking for ~>36-48GB memory, here's the options:
$2,800 = 48GB, Apple M4 Pro chip with 14‑core CPU, 20‑core GPU
$3,200 = 36GB, Apple M4 Max chip with 14‑core CPU, 32‑core GPU
$3,600 = 48GB, Apple M4 Max chip with 16‑core CPU, 40‑core GPU
So the M4 Pro could get you a lot of memory, but less GPU cores. Not sure how much those GPU cores factor in to performance, I only really hear complaints about the memory limits... Something to consider if looking to buy in this range of memory.
Of course, a lot of people here probably consider it not a big deal to throw an extra 3 grand at hardware, but I'm a hobbyist in academia when it comes to AI; I don't make big 6-figure salaries :-)
M4 Max 14 core has a single option of 36GB.
M4 Max 16 core lets you go up to 128GB.
So you can actually get more ram with the Pro than the base level Max.
Although machines with Apple Silicon swap flawlessly, I worry about degrading the SSD, which is non-replaceable. So ultimately I pay for more RAM and don't need swapping at all.
wot, m8? Only Apple will call a 12 megapixel camera “advanced”. Same MPs as an old iPhone 6 rear camera.
Aside from that, it’s pretty much the same as the prior generation. Same thickness in form factor. Slightly better SoC. Only worth it if you jump from M1 (or any Intel mbp) to M4.
Would be godlike if Apple could make the chip swappable. Buy a Mac Studio M2 Ultra Max Plus. Then just upgrade SoC on an as needed basis.
Would probably meet their carbon neutral/negative goals much faster. Reduce e-waste. Unfortunately this is an American company and got to turn profit. Profit over environment and consumer interests.
There will always be a long tail of niche Windows games (retro + indie especially). But you can capture the Fortnite (evergreen) / Dragon Age (new AAA) audience.
1) Either Apple wants to maintain the image of the Macbook as a "serious device", and not associate itself with the likes of "WoW players in their mom's basement".
2) Microsoft worked something out with Apple, where Apple would not step significantly on the gaming market (Windows, Xbox). I can't think of another reason why gaming on iOS would be just fine, but abysmal on MacOS. Developers release games on MacOS _despite_ the platform.
I blame the confusion on PC & Android marketing people, who pushed for years and years the idea that the higher the megapixel count, the better the camera. Non-Apple customers should be really pissed off about the years of misinformation and indoctrination on false KPIs.
The marketing gimmicks pushed generations of devices to optimize for meaningless numbers. At times, even Apple was forced to adopt those. Such a shame.
Wish the nano-texture display was available when I upgraded last year. The last MacBook I personally bought was in 2012 when the first retina MBP had just released. I opted for the "thick" 15" high-res matte option. Those were the days...
> MacBook Air: The World’s Most Popular Laptop Now Starts at 16GB
> MacBook Air is the world’s most popular laptop, and with Apple Intelligence, it’s even better. Now, models with M2 and M3 double the starting memory to 16GB, while keeping the starting price at just $999 — a terrific value for the world’s best-selling laptop.
I’ve been using this exact model for about a year and I rarely find limitations for my typical office-type work. The only time I’ve managed to thermally throttle it has been with some super-suboptimal Excel macros.
I believe the rumor is that the MacBook Air will get the update to M4 in early spring 2025, February/March timeline.
That said, I believe you. Some press gets a hands-on on Wednesday (today) so unless they plan to pre-announce something (unlikely) or announce software only stuff, I think today is it.
Also, the Studio and Mac Pro are left hanging.
The streaming apps virtually all support downloading for offline viewing on iPhone, but the Apple TV just becomes a paperweight when the internet goes out, because I'm not allowed to use the 128GB of storage for anything.
If they're not going to let you use the onboard storage, then it seems unlikely for them to let you use USB storage. So, first, I would like them to change their app policies regarding internal storage, which is one of the purely software improvements I would like to see.
But there are some cases like e.g. watching high-res high-FPS fractal zoom videos (e.g. https://www.youtube.com/watch?v=8cgp2WNNKmQ) where even brief random skipped frames from other things trying to use WiFi at the same time can be really noticeable and annoying.
Unfortunately Apple won’t tell you until the day they sell the machines.
Only in the US, it seems. India got a price increase of $120.
Makes me wonder what else will be updated this week (Studio or Mac Pro)?
And yes, with enough RAM, it is a surprisingly good dev laptop.
Just kidding! As an Apple Shareholder I feel like you should take what Apple gives you and accept the price. ;)
I might still keep it another year or so, which is a testament to how good it is and how relative little progress has happened in almost 10 years.
I've heard it's easier to just use cloud options, but I still like the idea of being able to run actual models and train them on my laptop.
I have a M1 MacBook now and I'm considering trading in to upgrade.
I've seen somewhat conflicting things regarding what you get for the money. For instance, some reports recommending a M2 Pro for the money IIRC.
I hope that the response times have improved, because it has been quite poor for a 120 Hz panel.
The so-called "antiglare" option wasn't true matte. You'd really have to go back to 2008.
For my current laptop, I finally broke down and bought a tempered glass screen protector. It adds a bit of glare, but wipes clean — and for the first time I have a one-year-old MacBook that still looks as good as new.
(I tend to feel if you want something specialized, you gotta pay for the expensive model)
You can tell not because the system temp rises, but because suddenly Spotify audio begins to pop, constantly and irregularly.
It took me a year to figure out that the system audio popping wasn't hardware and indeed wasn't software, except in the sense that memory (or CPU?) pressure seems to be the culprit.
Even when I remove all "Intel" type apps in activity monitor, I still experience the issue though.
This is nice, and long overdue.
It sounds more exciting than M4 is 12.5% faster than M3.
It's actually interesting to think about. Is there a speed multiplier that would get me off this machine? I'm not sure there is. For my use case the machine performance is not my productivity bottleneck. HN on the other hand... That one needs to be attenuated. :)
I've just ordered an (almost) top-of-the-range MBP Max, my current machine is an MBP M1-max, so the comparisons are pretty much spot-on for me.
Selling the M1 Ultra Studio to help pay for the M4 MBP Max, I don't think I need the Studio any more, with the M4 being so much faster.
I'd really like to justify upgrading, but a $4k+ spend needs to hit greater than 2x for me to feel it's justified. 1.8x is still "kind of the same" as what I have already.
Back when Moore's law was still working they didn't skip generations like this.
M4 is built with TSMC's 2nd Gen 3nm process. M3 is on the 1st gen 3nm.
For the base M3 vs base M4:
- the CPU (4P+4E) & GPU (8) core counts are the same
- NPU perf is slightly better for M4, I think, (M4's 38TOPS @ INT8 vs M3's 18TOPS @ INT16)
- Memory Bandwidth is higher for M4 (120 GB/s vs 102.4 GB/s)
- M4 has a higher TDP (22W vs 20W)
- M4 has higher transistor count (28B vs 25B)
Everyone knows SSDs made a big difference in user experience. For the CPU, normally if you aren't gaming at high settings or "crunching" something (compiling or processing video etc.) then it's not obvious why CPU upgrades should be making much difference even vs. years-old Intel chips, in terms of that feel.
There is the issue of running heavy JS sites in browsers but I can avoid those.
The main issue seems to be how the OS itself is optimized for snappiness, and how well it's caching/preloading things. I've noticed Windows 10 file system caching seems to be not very sophisticated for example... it goes to disk too often for things I've accessed recently-but-not-immediately-prior.
Similarly when it comes to generating heat, if laptops are getting hot even while doing undemanding office tasks with huge periods of idle time then basically it points to stupid software -- or let's say poorly balanced (likely aimed purely at benchmark numbers than user experience).
https://nanoreview.net/en/cpu-compare/apple-m1-vs-amd-ryzen-...
Not all products got the M3, so in some lines this week is the first update in quite a while. In others like MBP it’s just the yearly bump. A good performing one, but the yearly bump.
Great to see Affinity becoming so popular that it gets acknowledged by Apple.
It’s essentially a matte coating, but the execution on iPad displays is excellent. While it doesn’t match the e-ink experience of devices like the Kindle or ReMarkable, it’s about 20-30% easier on the eyes. The texture feels also great (even though it’s less relevant for a laptop), and the glare reduction is a welcome feature.
I prefer working on the MacBook screen, but I nearly bought an Apple Studio Display XDR or an iPad as a secondary screen just for that nano-texture finish. It's super good news that this is coming to the MacBook Pro.
I am probably not the best example to emulate lol.
I will upgrade to the M4 Pro. I really hate the glare when I travel (and I do that a lot), but at the same time I don't want to lose any quality that the MBP delivers, which is quite excellent imho.
It's easier to read on it.
Of course I'm rooting for competition, but Apple seems to be establishing a bigger and bigger lead with each iteration.
Another positive development was bumping up baseline amounts of RAM. They kept selling machines with just 8 gigabytes of RAM for way longer than they should have. It might be fine for many workflows, but feels weird on “pro” machines at their price points.
I’m sure Apple has been coerced to up its game because of AI. Yet we can rejoice in seeing their laptop hardware, which already surpassed the competition, become even better.
In January, after researching, I bought an Apple-refurbished MBP with an M2 Max over an M3 Pro/Max machine because of the performance/efficiency core ratio. I do a lot of music production in DAWs, and many, even Apple's Logic Pro, don't really make use of efficiency cores. I'm curious about what constraints have led to this... but perhaps this also factors into Apple's choice to increase the ratio of performance to efficiency cores.
I believe that’s the case. Most times, the performance cores on my M3 Pro laptop remain idle.
What I don’t understand is why battery life isn’t more like that of the MacBook Airs when not using the full power of the SOC. Maybe that’s the downside of having a better display.
Curious how you're measuring this. Can you see it in Activity Monitor?
> Maybe that’s the downside of having a better display.
Yes I think so. Display is a huge fraction of power consumption in typical light (browsing/word processing/email) desktop workloads.
I use an open source app called Stats [1]. It provides a really good overview of the system on the menu bar, and it comes with many customization options.
Yes, processor history in the activity monitor marks out specific cores as Performance and Efficiency.
Example: https://i.redd.it/f87yv7eoqyh91.jpg
Is an upgrade really worth it?
If you do any amount of 100% CPU work that blocks your workflow, like waiting for a compiler or typechecker, I think M1 -> M4 is going to be worth it. A few of my peers at the office went M1->M3 and like the faster compile times.
Like, a 20 minute build on M1 becoming a 10 minute build on M4, or a 2 minute build on M1 becoming a 1 minute build on M4, is nothing to scoff at.
I myself don’t need so much performance, so I tend to keep my devices for many, many years.
> All Apple Silicon Macs are in scope, as well as future generations as development time permits. We currently have support for most machines of the M1 and M2 generations.[^1][^2]
https://softwareengineeringdaily.com/2024/10/15/linux-apple-...
1. Nested virtualization doesn't work in most virtualization software, so if your workflow involves running stuff in VMs it is not going to work from within another VM. The exception is apparently now the beta version of UTM with the Apple Virtualization backend, but that's highly experimental.
2. Trackpad scrolling is emulated as discrete mouse wheel clicks, which is really annoying for anyone used to the smooth scrolling on macOS. So what I do is use macOS for most browsing and other non-technical stuff but do all my coding in the Linux VM.
It's 2024, and I still see most Windows users carrying a mouse to use with their laptop.
New video format or more demanding music software is released that slows the machine down, or battery life craters.
Well, I haven’t had even a tinge of feeling that I need to upgrade after getting my M1 Pro MBP. I can’t remember it ever skipping a beat running a serious Ableton project, or editing in Resolve.
Can stuff be faster? Technically of course. But this is the first machine that even after several years I’ve not caught myself once wishing that it was faster or had more RAM. Not once.
Perhaps it’s my age, or perhaps it’s just the architecture of these new Mac chips are just so damn good.
Apple's M1 came at a really interesting point. Intel was still dominating the laptop game for Windows laptops, but generational improvements felt pretty lame. A whole lot of money for mediocre performance gains, high heat output and not very impressive battery. The laptop ecosystem changed rapidly as not only the Apple M1 arrived, but also AMD started to gain real prominence in the laptop market after hitting pretty big in the desktop and data center CPU market. (Addendum: and FWIW, Intel has also gotten a fair bit better at mobile too in the meantime. Their recent mobile chipsets have shown good efficiency improvements.)
If Qualcomm's Windows on ARM efforts live past the ARM lawsuit, I imagine a couple generations from now they could also have a fairly compelling product. In my eyes, there has never been a better time to buy a laptop.
(Obligatory: I do have an M2 laptop in my possession from work. The hardware is very nice, it beats the battery life on my AMD laptop even if the AMD laptop chews through some compute a bit faster. That said, I love the AMD laptop because it runs Linux really well. I've tried Asahi on an M1 Mac Mini, it is very cool but not something I'd consider daily driving soon.)
You say that, but I get extremely frustrated at how slow my Surface Pro 10 is (with an Ultra 7 165U).
It could be Windows of course, but this is a much more modern machine than my MacBook Air (M1) and at times feels like it's almost 10 years old in comparison, despite being 3-4 years newer.
That said, Intel still has yet to catch up to AMD on efficiency unfortunately, they've improved generationally but if you look at power efficiency benchmarks of Intel CPUs vs AMD you can see AMD comfortably owns the entire top of the chart. Also, as a many-time Microsoft Surface owner, I can also confirm that these devices are rarely good showcases for the chipsets inside of them: they tend to be constrained by both power and thermal limits. There are a lot of good laptops on the market, I wouldn't compare a MacBook, even a MacBook Air, a laptop, with a Surface Pro, a 2-in-1 device. Heck, even my Intel Surface Laptop 4, a device I kinda like, isn't the ideal showcase for its already mediocre 11th gen Intel processor...
The Mac laptop market is pretty easy: you buy the laptops they make, and you get what you get. On one hand, that means no need to worry about looking at reviews or comparisons, except to pick a model. They all perform reasonably well, the touchpad will always be good, the keyboard is alright. On the other hand, you really do get what you get: no touchscreens, no repairability, no booting directly into Windows, etc.
And it's not the same - running Windows natively on Mac would seriously degrade the Mac, while running macOS on a PC has no reason to make it worse than with Windows. Why not buy a PC laptop at that point? The close hardware/OS integration is the whole point of the product. Putting Windows into a VM lets you use best of both.
I'm pretty sure you would never use a Windows PC just to boot into a macOS VM, even if it was flawless. And there are people who would never boot a Mac, just to boot into a Windows VM, even if it was flawless. And no, it's not flawless. Being able to run a relatively old strategy game is not a great demonstration of the ability generally play any random Windows game. I have a Parallels and VMWware Fusion license (well... Had, anyway), and I'm a long time (20 years) Linux user, I promise that I am not talking out my ass when it comes to knowing all about the compromises of interoperability software.
To be clear, I am not trying to tell you that the interoperability software is useless, or that it doesn't work just fine for you. I'm trying to say that in a world where the marketshare of Windows is around 70%, a lot of people depend on software and workflows that only work on Windows. A lot of people buy PCs specifically to play video games, possibly even as a job (creating videos/streaming/competing in esports teams/developing video games and related software) and they don't want additional input latency, lower performance, and worse compatibility.
Even the imperfections of virtual machines aside, some people just don't like macOS. I don't like macOS or Windows at all. I think they are both irritating to use in a way that I find hard to stomach. That doesn't mean that I don't acknowledge the existence of many people who very much rely on their macOS and Windows systems, the software ecosystems of their respective systems, and the workflows that they execute on those systems.
So basically, aside from the imperfections of a virtual machine, the ability to choose to run Windows as your native operating system is really important for the obvious case where it's the operating system you would prefer to run.
Battery life is decent.
At this point I’m not switching from laptop Linux. The machines can even game (thanks proton/steam)
https://browser.geekbench.com/macs/macbook-pro-14-inch-2021-...
https://browser.geekbench.com/v6/cpu/4260192
Both of these CPUs perform well enough that most users will not need to be concerned at all about the compute power. Newer CPUs are doing better but it'd be hard to notice day-to-day.
As for other laptop features... That'll obviously be vendor-dependent. The biggest advantage of the PC market is all of the choices you get to make, and the biggest disadvantage of the PC market is all of the choices you have to make. (Edit: Though if anyone wants a comparison point, just for sake of argument, I think generally the strongest options have been from ASUS. Right now, the Zephyrus G16 has been reviewing pretty good, with people mostly just complaining that it is too expensive. Certainly can't argue with that. Personally, I run Framework, but I don't really run the latest-and-greatest mobile chipsets most of the time, and I don't think Framework is ideal for people who want that.)
those are another two reasons why I can't ignore Apple Silicon
I've had my xps 13 since 2016. Really the only fault I have against it nowadays is that 8gb of ram is not sufficient to run intellij anymore (hell, sometimes it even bogs down my 16gb mbp).
Now, I've also built an absolute beast of a workstation with a 7800x3d, 64gb ram, 24 gb vram and a fast ssd. Is it faster than both? Yeah. Is my old xps slow enough to annoy me? Not really. Youtube has been sluggish to load / render here lately but I think that's much more that google is making changes to make firefox / ublock a worse experience than any fault of the laptop.
My Skylake one (I think that would be 6 years old now?) is doing absolutely fine. My Broadwell one is starting to feel a little aged but perfectly usable, I wouldn't even _consider_ upgrading it if I was in the bottom 95% of global income.
Compiling is very slow on these, but I think I'd avoid compilation on my laptop even if I had a cutting edge CPU?
YMMV.
FWIW, Qualcomm cancelled orders of its Windows devkit and issued refunds before the lawsuit. That is probably not a good sign
My work machine was upgraded from an M1 with 16GB of RAM to an M3 Max with 36GB and the difference in Xcode compile times is beyond belief: I went from something like 1-2 minutes to 15-20 seconds.
Obviously if opening a browser is the most taxing thing your machine is doing the difference will be minimal. But video or music editing, application-compiling and other intensive tasks, then the upgrade is PHENOMENAL.
I upgraded from a 13 Pro to a 15 Pro expecting zippier performance, and it feels almost identical, if not weirdly a bit slower, in rendering and typing.
I wonder what it will take to make Mac/iOS feel faster
I went from an iPhone 13 mini to an iPhone 16 and it's a significant speed boost.
The new camera button is kinda nice though.
I was initially indifferent about the camera button, but now that I'm used to it it's actually very useful.
I know, disabling shadows and customisable animation times ;) On a jailbroken phone I once could disable all animation delays, it felt like a new machine (must add that the animations are very important and generally great ux design, but most are just a tad too slow)
The CPU? Ah, never really felt a difference.
Infuriated by the 13.
The 3.5mm audio Lightning adapters disconnect more often than usual. All I need to do is tap the adapter and it disconnects.
And that Apple has now stopped selling them is even more infuriating, it's not a faulty adapter.
I use a small Anker USB-A to USB-C adapter [1]. They're rock solid.
As great as the AirPod Pro 2s are, a wired connection is superior in terms of reliability and latency. Although greatly improved over the years, I still have occasional issues connecting or switching between devices.
Out of curiosity, what's the advantage of a jailbroken iPhone nowadays? I'd typically unlock Android phones in the past, but I don't see a need on iOS today.
Interestingly, the last time I used Android, I had to sideload Adguard (an adblocker). On the App Store, it's just another app alongside competing adblockers. No such apps existed in the Play Store to provide system-level blocking, proxying, etc. Yes, browser extensions can be used, but that doesn't cover Google's incessant quest to bypass adblockers (looking at you Google News).
[0] https://www.audio-technica.com/en-us/ath-m50xsts [1] https://www.amazon.com/Adapter-Anker-High-Speed-Transfer-Not...
I have custom scripts, Ad blocking without VPNs, Application firewalls.
I enjoy having nearly full control of my device.
The what? Is this the adapter for 3.5mm headphones? If so, you don't have to get Apple-made dongles. Third parties make them also.
I'd guess the GPs actual problem is lint in the Lightning port though. Pretty common, relatively easy to clean out too, especially compared to USB-C.
Regardless of either, they both have the same fault.
The connector between the phone and the adapter is poor. It could just be a fault with my phone but I have no way of proving this.
I suspect this sounds like a problem with your specific phone. Never had a problem with any lightning accessories myself.
But it is wild that two years ago running any sort of useful genAI stuff on a MBP was more-or-less a theoretical curiosity, and already today you can easily run models that would have exceeded SotA 2 years ago.
Somewhat ironically, I got into the "AI" space a complete skeptic, but thinking it would be fun to play with nonetheless. After 2 years of daily work with these models, I'm increasingly convinced they are going to become quite disruptive. No AGI, but they will certainly reduce a lot of labor and enable things that weren't really feasible before. Best of all, it's clear a lot of this work will be doable from a laptop!
I upgraded my M1 MBP to a MacBook Air M3 15" and it was a major upgrade. It is the same weight but 40% faster and so much nicer to work on while on the sofa or traveling. The screen is also brighter.
I think very few people actually do need the heavy MBPs, especially not the web/full-stack devs who populate Hacker News.
EDIT: The screens are not different in terms of brightness.
I can fairly easily get my M1 Air to have thermal issues while on extended video calls with some Docker containers running, and have been on calls with others having the same issue. Kind of sucks if it's, say, an important demo. I mostly use it as a thin client to my desktop when I'm away from home, so it's not really an issue, but if I were using it as a primary device I'd want a machine with a fan.
Air doesn't support 120Hz refresh either.
There's an app that allows to unlock max brightness on Pros (Vivid)[0] even without HDR content (no affiliation).
HDR support is most noticeable when viewing iPhone photos and videos, since iPhones shoot in HDR by default.
I may or may not have seen HDR content accidentally, but I’m not sure.
[0] Hawaii LG Demo: https://www.youtube.com/watch?v=WBJzp-y4BHA [1] Nature Demo: https://www.youtube.com/watch?v=NFFGbZIqi3U
YouTube shows a small red "HDR" label on the video settings icon for actual HDR content. For this label to appear, the display must support HDR. With your M3 Pro, the HDR label should appear in Chrome and Safari.
You can also right-click on the video to enable "Stats for nerds" for more details. Next to color, look for "smpte2084 (PQ) / bt2020". That's usually the highest-quality HDR video [2,3].
You can ignore claims such as "Dolby Vision/Audio". YouTube doesn't support those formats, even if the source material used it. When searching for videos, apply the HDR filter afterward to avoid videos falsely described as "HDR".
Keep in mind that macOS uses a different approach when rendering HDR content. Any UI elements outside the HDR content window will be slightly dimmed, while the HDR region will use the full dynamic range.
I consider Vivid [4] an essential app for MacBook Pro XDR displays.
Once installed, you can keep pressing the "increase brightness" key to go beyond the default SDR range, effectively doubling the brightness of your display without sacrificing color accuracy. It's especially useful outdoors, even indoors, depending on the lighting conditions. And fantastic for demoing content to colleagues or in public settings (like conference booths).
[2] https://www.benq.com/en-us/knowledge-center/knowledge/bt2020... [3] https://encyclopedia.pub/entry/32320 (see section 4) [4] https://www.getvivid.app/
I’m looking forward to the day I notice the difference so I can appreciate what I have.
I can’t understand the people who notice the 120 hz adaptive refresh whatever and one guess is their use is a lot twitchier than mine.
Even 90Hz (like on some Pixels) is substantially better than the iPhone's 60Hz.
I am due to update my Mac mini because my current one can't run Sonoma, but, apart from that, it's a lovely little box with more than enough power for me.
The modern AMD or Intel desktops I've tried obviously are much faster when performing large builds and such but for general computing, web browsing, and so forth I literally don't feel much of a difference. Now for mobile devices it's a different story due to the increased efficiency and hence battery life.
And yes. Web apps are not really great on low-spec machines.
The latest of whatever you have will be so much better than the intel one, and the next advances will be so marginal, that it's not even worth looking at a buyer's guide.
A 16GB model for about a thousand bucks?? I can't believe how far MacBooks have come in the last few years.
I always catch myself in this same train of thought until it finally re-occurs to me that "no, the variable here is just that you're old." Part of it is that I have more money now, so I buy better products that last longer. Part of it is that I have less uninterrupted time for diving deeply into new interests which leads to always having new products on the wishlist.
In the world of personal computers, I've seen very few must-have advances in adulthood. The only two unquestionably big jumps I can think of offhand are Apple's 5K screens (how has that been ten years?!) and Apple Silicon. Other huge improvements were more gradual, like Wi-Fi, affordable SSDs, and energy efficiency. (Of course it's notable that I'm not into PC gaming, where I know there have been incredible advances in performance and display tech.)
Where this might shift is as we start using more applications that are powered by locally running LLMs.
but yes, I was looking at and anticipating the max RAM on the M4 as well as the max memory speed
128GB of RAM and 546GB/s memory bandwidth
I like it, I don't know yet on an upgrade. But I like it. Was hoping for more RAM actually, but this is nice.
The MacBook Pro does seem to have some quality-of-life improvements: Thunderbolt 5, a 14-megapixel Center Stage (follows you) camera, three USB-C ports on all models, and battery life claims of 22-24 hours. Regardless, if you want a MacBook Pro and you don't have one, there is now an argument for not just buying the previous model.
Why would people feel the need to upgrade?
And this applies already to phones. Laptops have been slowing for even longer.
Apple has been shipping "neural" processors for a while now, and when software with local inference starts landing, Apple hardware will be a natural place for it. They'll get to say "Your data, on your device, working for you; no subscription or API key needed."
Getting an extra five years of longevity (after RAM became fixed) for an extra 10% was a no-brainer imho.
It is absolutely, 100%, no doubt in my mind: the hardware.
Only recently I noticed some slowness. I think Google Photos changed something and they show photos in HDR and it causes unsmooth scrolling. I wonder if it's something fixable on Google's side though.
The only reason the 2009 one now gets little use is that its motherboard has developed some electronic issues; otherwise it would serve me perfectly well.
In fact, I bought a highly discounted Mac Studio with M1 Ultra because the M1 is still so good and it gives me 10Gbit ethernet, 20 cores and a lot of memory.
The only thing I am thinking about is going back to the MacBook Air again since I like the lighter form factor. But the display, 24 GiB max RAM and only 2 Thunderbolt ports would be a significant downgrade.
Other than that it cruises across all other applications. Hard to justify an upgrade purely for that one issue when everything else is so solid. But it does make the eyes wander...
One good use case for a 32GB Mac is being able to run 8B models at full precision, something that is not possible with 8-16GB Macs.
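Rough back-of-the-envelope on why 32GB matters there (a minimal sketch; the overhead numbers are my own assumptions, not measurements):

    # Can an 8B model fit in RAM at a given precision?
    # Overhead figures below are rough assumptions for illustration.
    def weights_gb(params_billion, bytes_per_param):
        return params_billion * bytes_per_param  # 1 GB ~= 1e9 bytes, close enough

    fp16 = weights_gb(8, 2.0)   # ~16 GB of weights at full (16-bit) precision
    q4   = weights_gb(8, 0.5)   # ~4 GB at 4-bit quantization
    kv_cache = 2                # assumed; grows with context length
    os_and_apps = 6             # assumed; macOS + browser + editor

    print(f"fp16: ~{fp16 + kv_cache + os_and_apps:.0f} GB total -> wants a 32 GB machine")
    print(f"4-bit: ~{q4 + kv_cache + os_and_apps:.0f} GB total -> fine on 16 GB")

At 16-bit the weights alone already blow past a 16GB machine once the OS takes its share, while a quantized version fits comfortably.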
and probably it's good that at least one of the big players has a business model that supports driving that forward
> Up to 7x faster image processing in Affinity Photo when compared to the 13‑inch MacBook Pro with Core i7, and up to 1.8x faster when compared to the 13-inch MacBook Pro with M1.
I feel the same about my 2011 laptop, so I guess it is partly age (not feeling the urge to always have the greatest) and partly that computing outside of LLMs and gaming is not demanding enough to force us to upgrade.
Over the last few years, Chrome seems to have stepped up its energy and memory use, which impacts most casual use these days. Safari has also become more efficient, but then it never felt bloated the way Chrome used to.
That said, they are in a very comfortable position right now, with neither Intel, AMD, nor any other competitor able to produce anything close to the bang-for-watt that Apple is managing. There's little pressure from behind to push for more performance.
It seems like they bump the base frequency of the CPU cores with every revision to get some easy performance gains (the M1 was 3.2 GHz and the M3 is now 4.1 GHz for the performance cores), but it looks like this comes at the cost of it not being able to maintain the performance; some M3 reviews noted that the system starts throttling much earlier than an M1.
But it's a heavy brick with a short battery life compared to the M1/2/3 Mac.
The base model is perfect. Now to decide between the M3/M4 Air and the M4 Pro.
Only downside is the screen. The brightness sort of has to be maxed out to be readable, and viewing at the wrong angle makes even that imperfect.
That said it’s about the same size / weight as an iPad Pro which feels much more portable than a pro device
I’ve tried a bunch of ways to do this - and frankly the translation overhead is absolute pants currently.
Not a showstopper though, for the 20-30% of complete pain in the ass cases where I can’t easily offload the job onto a VPS or a NUC or something, I just have a ThinkPad.
Has nothing whatsoever to do with CPU/memory/etc.
Frankly though, if the Mac mini were at a slightly lower price point, I'd definitely build my own Mac mini cluster for my AI home lab.
That's me, I don't give a shit about AI, video editing, modern gaming or Kubernetes. The newest and heaviest piece of software I care about is VSCode. So I think you're absolutely correct. Most of what's new since Docker and VSCode has not contributed massively to how I work, and most of the things I do could be done just fine 8-10 years ago.
To me it's more like 3d printing as a niche/hobby.
< 1% of all engagement with a category thing is niche/hobby, yes.
How old are you?
"Bro" has been gender neutral for over a decade. Males and females under the age of 25 call each other "bro" all the time.
Example?
What I do know is that Linux constantly breaks stuff. I don't even think it's treading water; these interfaces are actively getting worse.
I have a Macbook Air M1 that I'd like to upgrade, but they're not making it easy. I promised myself a couple of years ago I'll never buy a new expensive computing device/phone unless it supports 120 hertz and Wi-Fi 7, a pretty reasonable request I think.
I got the iPhone 16 Pro, guess I can wait another year for a new Macbook (hopefully the Air will have a decent display by then, I'm not too keen to downgrade the portability just to get a good display).
The quality stuff retains value, not brand.
They have the highest product quality of any laptop manufacturer, period. But to say that all Apple products hold value well is simply not true. All quality products hold value well, and most of Apple's products are quality.
I guarantee you that if Apple produced a trashy laptop it would have no resale value.
Again, the quality holds the value not the brand.
That said, they did suffer from some self inflicted hardware limitations, as you hint. One reason I like the MBP is the return of the SD card slot.
(Old Pentium Pro, PII, multi chip desktop days) -- When I did a different type of work, I would be in love with these new chips. I just don't throw as much at my computer anymore outside of things being RAM heavy.
The M1 (with 16 GB ram) is really an amazing chip. I'm with you, outside of a repair/replacement? I'm happy to wait for 120hz refresh, faster wifi, and longer battery life.
They always have. If you want an objective measure of planned obsolescence, look at the resale value. Apple products hold their resale value better than pretty much every competitor because they stay useful for far longer.
It also avoids the trouble of a hosted LLM deciding to double its price overnight; costs are very predictable.
I genuinely want to use it as my primary machine, but with this Intel MacBook Pro I have, I absolutely dislike FaceTime, iMessage, the need to use the App Store, Apple constantly asking me for an Apple username and password (which I don't have and have zero intention of creating), having to block Siri and all the telemetry Apple has baked in, stopping the machine from calling home, etc.
This is to mirror the tools available in Windows for disabling and removing Microsoft bloatware and the built-in ad tracking.
Pretty much all the software I use is from brew.
If that's your question, yes - various options exist like https://asahilinux.org
Then Windows 11 came out.
My primary desktop & laptop are now both Macs because of all the malarkey in Win11. Reappearance of ads in Start and Windows Recall were the last straws. It's clear that Microsoft is actively trying to monetize Windows in ways that are inherently detrimental to UX.
I do have to say, though, that Win11 is still more customizable overall, even though it - amazingly! - regressed below macOS level in some respects (e.g. no vertical taskbar option anymore). Gaming is another major sticking point - the situation with non-casual games on macOS is dismal.
Get a PC.
That, combined with the icloud and telemetry BS, I'd had enough.
Not to mention that in the US the cell phone carriers artificially limit tethering speed or put data caps on it when you tether from your phone. You have to buy a dedicated data-only plan and modem.
Most cellular carriers offer unlimited on-device data plans, but they cap data for tethering. Integrating an LTE modem into a laptop essentially requires a mobile data plan with unlimited tethering - which, AFAIK, doesn’t exist at the moment. I’m not sure why.
I've always heard that patent disputes were at the root of the lack of a modem option. Apple had a prototype MacBook Pro back in the early Intel days IIRC but it was never released.
Maybe if Apple ever gets their in-house modems working, we'll see them on all of the product lines, but until then, it's a niche use case that likely isn't causing them to lose a ton of sales.
I understand that. My point is that I think an LTE modem in a laptop might reasonably use far more data than an LTE modem in a phone or tablet. Most people who download and/or upload very large files do so on their computer rather than their mobile devices.
On a side note, anyone know what database software was shown during the announcement?
https://www.datensen.com/data-modeling/luna-modeler-for-rela...
MBP: Apple M4 Max chip with 16‑core CPU, 40‑core GPU and 16‑core Neural Engine
Mac mini: Apple M4 Pro chip with 14‑core CPU, 20‑core GPU, 16-core Neural Engine
What kind of workload would make me regret not having bought the MBP over the Mac mini, given the above? Thanks!
Workloads like:
- photo/video editing,
- games, or
- AI (training / inference)
that would benefit from the extra GPUs.
I think you need to pick the form factor that you need combined with the use case:
- Mobility and fast single core speeds: MacBook Air
- Mobility and multi-core: MacBook Pro with M4 Max
- Desktop with lots of cores: Mac Studio
- Desktop for single core: Mac mini
I really enjoy my MacBook Air M3 24GB for desktop + mobile use for webdev: https://news.ycombinator.com/item?id=41988340
The base model doesn't support Thunderbolt 5.
And the base model still doesn't support more than 2 external displays without DisplayLink (not DisplayPort!) hardware+software.
"M4 and M4 Pro
Simultaneously supports full native resolution on the built-in display at 1 billion colors and:
Up to two external displays with up to 6K resolution at 60Hz over Thunderbolt, or one external display with up to 6K resolution at 60Hz over Thunderbolt and one external display with up to 4K resolution at 144Hz over HDMI
One external display supported at 8K resolution at 60Hz or one external display at 4K resolution at 240Hz over HDMI"
"The display engine of the M4 family is enhanced to support two external displays in addition to a built-in display."
https://www.apple.com/newsroom/2024/10/apple-introduces-m4-p...
The only downsides are that I see a kind of "burnt?" transparent spot on my screen, and when connecting over an HDMI cable, the sound does not output properly to the TV and the video I play becomes laggy. Wondering if going to the Apple Store would fix it?
Personal anecdote: don't get your hopes up. I've had my issues rejected as 'no fault found', but it's definitely worth spending a bit of time on.
At long last, I can safely recommend the base model MacBook Air to my friends and family again. At $1000 ($900 with edu pricing on the M2 model) it really is an amazing package overall.
> Up to 4.6x faster build performance when compiling code in Xcode when compared to the 16‑inch MacBook Pro with Intel Core i9, and up to 2.2x faster when compared to the 16‑inch MacBook Pro with M1 Max.
OK, that's finally a reason to upgrade from my M1.
Before the M4 models: omg, Apple only gives you 8GB RAM in the base model? Garbage!
After the M4 models: the previous laptops were so good, why would you upgrade?
The M3 Pro has 192-bit-wide memory; GPU improvements mostly offset the decrease in memory bandwidth. This leads to memory options like 96GB.
The M4 Pro has 256-bit-wide memory, hence the factor-of-2 memory options.
DRAM chips don't just come in power of two sizes anymore. You can even buy 24GB DDR5 DIMMs.
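If you want to see how bus width maps onto capacity options, here's a crude sketch; the 64-bit channel width and the per-channel capacities are assumptions for illustration, not Apple's actual packaging:

    # Capacity options scale with the number of memory channels.
    # Channel width and per-channel capacities are illustrative assumptions.
    CHANNEL_WIDTH_BITS = 64

    def capacity_options(bus_width_bits, per_channel_gb):
        channels = bus_width_bits // CHANNEL_WIDTH_BITS
        return [channels * gb for gb in per_channel_gb]

    print(capacity_options(192, [6, 12, 32]))  # 3 channels -> [18, 36, 96]
    print(capacity_options(256, [6, 12, 16]))  # 4 channels -> [24, 48, 64]

The point is just that a 3-channel bus gives you multiples of 3, a 4-channel bus gives you multiples of 4, and non-power-of-two die sizes fill in the rest.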
:/
APPARENTLY NOT TODAY.
C'mon mannnnn. The 90s/y2k are back in! People want the colorful consumer electronics! It doesn't have to be translucent plastic like it was back then but give us at least something that doesn't make me wonder if I live in the novel The Giver every time I walk into a meetup filled with MacBook Pros.
I'm sure the specs are great.
I look at my local source vs the recording, and I am baffled.
After a decade of online meeting software, we still stream 480p quality it seems.
A huge part of group video chat is still "hacks" like downsampling non-speaking participants so the bandwidth doesn't kill the connection.
As we get fatter pipes and faster GPUs, streaming will get better.
edit: I mean... I could see a future where realtime video feeds never get super high resolution and everything effectively becomes a relatively seamless AI recreation where only facial movement data is transmitted, similar to how game engines work now.
A specced out Mac Studio (M2 being the latest model as of today) isn't cheap, but it can run 180B models, run them fast for the price, and use <300W of power doing it. It idles below 10W as well.
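Rough math on why that works (the quantization width and bandwidth figures are my assumptions, not measured numbers):

    # Weights must fit in unified memory; dense decode speed is roughly
    # memory bandwidth divided by bytes read per token.
    params_b      = 180    # billions of parameters
    bits_per_w    = 4.5    # assumed ~4-bit quantization plus overhead
    bandwidth_gbs = 800    # roughly the M2 Ultra's advertised bandwidth

    weights_gb   = params_b * bits_per_w / 8      # ~101 GB
    tokens_per_s = bandwidth_gbs / weights_gb     # ~8 tok/s ceiling for dense decode

    print(f"weights ~{weights_gb:.0f} GB, decode ceiling ~{tokens_per_s:.1f} tok/s")

So a quantized 180B model fits in a 192GB Studio with room to spare, and memory bandwidth is what sets the generation-speed ceiling.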
Looking at how long the 8GB lasted, it's a pretty sure bet that you won't need to upgrade for a good few years.
I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
I'd say the one incentive the MacBook Pro has over the Air is the better screen and better speakers. Not sure if it's worth the money.
> I mean, I have a MacBook air with 16gb of ram and it's honestly working pretty well to this day. I don't do "much" on it though but not many people do.
If an HN user can get along with 16gb on their MacBook Air for the last X years, most users were able to get by with 8gb.
They are supposed to be "green" but they encourage obsolescence.
8GB is fine for most use cases. Part of my gig is managing a huge global enterprise with six figures of devices. Metrics demonstrate that the lower quartile is ok with 8GB, even now. Those devices are being retired as part of the normal lifecycle with 16GB, which is better.
Laptops are 2-6 year devices. Higher end devices always get replaced sooner - you buy a high end device because the productivity is worth spending $. Low end tend to live longer.
Or you could get a Framework, and you could actually upgrade the parts that are worth upgrading, instead of "upgrading" by buying a whole new machine.
It's fine, but the issue is linux sleep/hibernate - battery drain. To use the laptop after a few days, I have to plug it in and wait for it to charge a little bit because the battery dies. I have to shut it down (not just close the screen) before flying or my backpack becomes a heater and the laptop dies. To use a macbook that's been closed for months I just open it and it works. I'll pay double for that experience. If I want a computer that needs to be plugged in to work I have a desktop for that already. The battery life is not good either.
Maybe it's better now if I take the time to research what to upgrade, but I don't have the time to tinker with hardware/linux config like I did a few years ago.
I don't really see a world where this machine doesn't last me a few more years. If there's anything I'd service, it would be the battery, but eh. It lasts more than a few hours and I don't go out much.
I'm getting tired of everything else being updated yet the product most needed is completely being neglected, and for years already.
And no, I don't wanna buy a separate tiny screen for thousands of dollars.
I'm also not interested in these tiny cubes you deem to be cool.
Regarding LLMs, the hottest topic here nowadays, I plan to either use the cloud or return to a bare-metal PC.
yeah it's about time
Are they going to claim that 16GB RAM is equivalent to 32GB on Intel laptops? (/sarc)
> Up to 23.8x faster basecalling for DNA sequencing in Oxford Nanopore MinKNOW when compared to the 16-inch MacBook Pro with Core i9, and up to 1.8x faster when compared to the 16-inch MacBook Pro with M1 Pro.
The linked Apple Store page says "MacBook Pro blasts forward with the M3, M3 Pro, and M3 Max chips" so it seems like the old version of the page still?
This is misleading:
https://news.ycombinator.com/item?id=25074959
"macOS sends hashes of every opened executable to some server of theirs"
To be fair, the link in this story is to a press release. Arguably there are probably many things in it that can be considered "misleading" in certain contexts.
A more charitable interpretation is that Apple only thinks that people with computers a few years old need to upgrade, and they aren't advertising to people with a <1 year old MacBook Pro.
No space grey?!
Don't think it's wise though; I bought a base M1 Pro MBP when it launched and don't feel a need to upgrade at all yet. I'm holding off for a few more years, until the next major increase in local LLM capability and battery life comes along.
Another observation: I've travelled the world and rarely see the people who could use robust, secure products the most (vulnerable people) using Apple products. They're all packing second-tier Samsung or LG Androids and old Windows notebooks (there are decent Samsung, LG, Android, and Windows products, but that's not what they have access to).
If it affects your earning power to that extent, you should probably pony up and save in the long run, probably just a few years until you see returns.
Caste system usually can't be bypassed by paying a monthly subscription fee.
I will note that making it a subscription will tend to increase the overall costs, not decrease. In an environment with ready access to credit, I think offering on a subscription basis is worse for consumers?
If it matters that much to you just sell the old one and buy the new. That's your subscription.
The compile times for Swift, the gigabytes of RAM everything seems to eat up.
I closed all my apps and I'm at 10gb of RAM being used - I have nothing open.
Does this mean the Macbook Air 8gb model I had 10 years ago would basically be unable to just run the operating system alone?
It's disconcerting. Ozempic for terrible food and car-centric infrastructure we've created, cloud super-computing and 'AI' for coping with this frankenstein software stack.
The year of the Linux desktop is just around the corner to save the day, right? Right? :)
It tells me my computer is using 8gb of RAM after a restart and I haven't begun to open or close anything.
Yikes?
Give us data, tell us what's new, and skip the nonsense buzz-filling adjectives.
To quote Russell Brand, just say he sat down, not that he placed his luscious ass in silk-covered trousers on a velvety smooth chair, experiencing pleasure as the strained thigh muscles received respite after a gruelling day on their feet, watching a lush sunset in a cool summer's evening breeze.
It would indeed have been nice to see a faster response rate screen, even though I value picture quality more, and it also would have been nice to see even vaguely different colors like the iMac supposedly got, but it seems like a nice spec bump year anyway.