I think this just became the go-to recommendation I'll give to anybody wanting an entry-level desktop computer of any kind. In fact I might buy one for my parents right now to replace the old mac mini they have. I really can't think of any reasonable competition for it at that price.
Best investment you’ll ever make. They’re not all that expensive. Having experienced 4k I feel impoverished having to return to lower resolutions.
I feel it’s a travesty that workplaces spend thousands on fancy desks and chairs and cheap out on bargain basement monitors.
That's what they said. I've been using Retina/HiDPI displays at work for close to a decade now. Still can't say I prefer one over the other. I have no problem seeing pixels, especially now that I've switched to Linux (KDE Plasma) at home. In fact I kind of like being able to catch a glimpse of the building blocks of the virtual world.
What actually does matter (for me) is uniformity and color accuracy. And you can't have that for cheap, especially not in 4K.
Edit: both of these machines are now running macOS 15.1.
But that workaround is “patched” on Apple Silicon and won’t work.
So yes, if you have an Apple Silicon Mac plugged into a 1440p display, it will look bad with any sort of "scaling", because scaling is disabled in macOS for sub-4K displays. What you're actually doing when you "scale" a 1440p display is running it at 1920x1080 resolution, hence it looks like ass. Before Apple Silicon, running that 1440p display at "1920x1080" actually just scaled the UI elements up to appear as though you had a 1920x1080 display; since it still utilized the full 2560x1440 pixels of the panel, "1920x1080" looked nicer than it does now.
So brass tacks it’s just about how macOS/OS X would obfuscate the true display resolution in the System Preferences -> Displays menu. Now with Apple Silicon Macs, “1920x1080” means “2x scaling” for 4K monitors and literally “we’ll run this higher-res monitor at literally 1920x1080” for any display under 4K resolution.
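To make the two behaviours concrete, here's a rough sketch of the pixel math (purely illustrative; not an exact model of macOS internals):

```python
def hidpi_pixels(logical_w, logical_h, scale=2):
    """HiDPI path: render at scale x the logical size, then downsample
    to the panel's native resolution, so every native pixel stays in use."""
    return logical_w * scale, logical_h * scale

def direct_pixels(logical_w, logical_h):
    """Apple Silicon on sub-4K panels: drive the panel at the logical
    resolution directly, leaving the extra native pixels unused."""
    return logical_w, logical_h

# "Looks like 1920x1080" on a 2560x1440 panel:
print(hidpi_pixels(1920, 1080))   # (3840, 2160) render buffer, downscaled to 2560x1440
print(direct_pixels(1920, 1080))  # panel literally runs at (1920, 1080)
```

The first path throws away no panel resolution; the second leaves a third of the vertical pixels unused, which is why it looks so much softer.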
The same way someone might not notice motion smoothing on a TV, or how bad scaling and text rendering looks on a 1366×768 panel, or different colour casts from different display technologies. All three took me a while before I could tell what was wrong without seeing them side by side.
Does any of that matter, though? Who bothers with the existence of hypothetical artifacts in their displays they cannot even see?
If you say it looks fine without it, I don't know what to say.
In short: you probably want to get at least a 4k display anyway, but if you want to delay that, you should buy BetterDisplay. The difference is night and day.
I agree, it works… fine. But sadly more and more elements of modern macOS will look blurry / aliased because they are only made with hi-DPI in mind.
For example all SF Symbols, as far as I know, are not defined as pixel graphics but only stored as vectors and rasterized on the fly. Which works great at high res and makes them freely scalable, but on low-DPI displays they certainly look worse than a pixel-perfect icon would.
The fact that so many seem to tolerate "low-res" or "mid-res" displays on the current M-series Macs is really puzzling to me... maybe my eyesight isn't as bad as I thought it was and everyone else's is a lot worse!?
This new M4 mini is tempting enough that I might try a Mac again... but this time I am definitely going to have to budget for a 4k/5k display.
> something like a 1440p monitor will look much worse on macOS than it would on Windows or Linux.
https://www.amazon.com/BOSGAME-5700U-Displays-Computers-Emul... https://www.amazon.com/Beelink-SER5-Desktop-Computer-Graphic...
I know a couple of iOS developers who recently switched to an M4 MacBook Pro, and they swear that some frequent workloads feel sluggish and slower than on the old Intel MacBook Pros. Being RAM-starved might have something to do with it, though.
> but there's loads of mini PCs with decent CPUs, 32GB RAM and a 1TB of SSD storage for under $600.
I also add that, unlike Apple hardware, these miniPCs are built with extensibility in mind. For example, most NUCs from the likes of minisforum and Beelink ship with a single SSD but support multiple SSDs, with their cases also having room for SATA drives. They even go as far as selling barebones versions of their NUCs, where customers can then pick and choose which RAM and SSDs to add.
You'll be able to sell your M4 mac mini in 5 years for $150 for an instant-cash offer from backmarket or any other reseller, while you'd be lucky to get $30 for the equivalent Beelink or BOSGAME after 6 months on ebay.
These are the dollar numbers claimed in the above post.
Your hypothetical scenario is so absurd you didn't even notice that your catastrophic scenario falls well within any warranty period.
In the meantime, I own cheap miniPCs that I use as daily drivers, and they soldier on for years.
I mean, do you really believe that NUCs from Intel and AMD will simply fall apart if they don't have an Apple sticker on them?
Macs do generally hold their resale value better than PCs, but that doesn’t necessarily have any correlation to usefulness.
I have bought several ThinkCentre small form factor PCs used for about $200 each, and they’ve each been about 5-7 years old. They’re perfectly fine and I can even get new parts from Lenovo, depending on the part and machine. Fantastic deal. They run loads of services in my home.
This reads like the epitome of Apple's reality distortion field. I mean, you're trying to convince yourself that a product is not overpriced compared to equivalent products, and that subjecting customers to price gouging is acceptable, by asserting that you might be able to sell it later. That's quite the logical leap.
If you want to put in a bit of elbow grease, you can get a much better deal. M1 Mac Minis in my area are regularly selling for $350+ on FB Marketplace right now.
My MacBook Pro (15-inch, mid-2017) is valued at $195 by Apple trade-in. Bought for 2k iirc.
I owned a 2014 MBP (~$1200?) for a long time and as late as 2019 it was resellable for $500.
Nice size. The Beelink has better reviews. Any name brands?
It's a nice little low wattage machine for running some docker containers.
The Ryzen one I got 2 years ago for my dad outperforms my M1 Pro.
I got a BRIX, which gave me nothing but trouble. Its UEFI is very picky about SSD brands; I wasted money on a couple of drives that are now being used as external ones, and in the end it didn't work out, not even with Windows.
It is now collecting dust waiting for the city hall disposal round.
There are many other manufacturers. I am biased towards fanless builds, like those two.
NUCs are also a great option, especially if you replace the case with a fanless Akasa one.
It starts at €599 for a 2(!) core Celeron. Seems absurd when you can get a Mini for an extra €100 (you can run Linux/Windows in a VM and still get a magnitude or few better perf). Or even a used old NUC or something; you'd need to go back very far to get a crappier CPU...
So the actual starting price seems to be €900-1000 (i.e. if you want an i5..)
The Celeron G6900 has a 46W TDP and seems to be around ~20% slower (multicore) than the <10W N100. It seems absurd that they are pushing garbage like that at such prices (even if it's the base config).
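Back-of-the-envelope perf/watt from those figures (taking the ~20% and the TDP numbers at face value, not as measurements):

```python
# Rough perf-per-watt comparison based on the figures cited above.
n100_perf, n100_tdp = 1.0, 10     # N100 as the baseline, ~10W TDP
g6900_perf, g6900_tdp = 0.8, 46   # G6900: ~20% slower multicore, 46W TDP

n100_ppw = n100_perf / n100_tdp
g6900_ppw = g6900_perf / g6900_tdp
print(f"N100 has ~{n100_ppw / g6900_ppw:.2f}x the perf/watt of the G6900")
```

So even on these rough numbers the N100 comes out nearly 6x more efficient, which is what makes the choice so odd for a fanless build.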
I am not endorsing any particular brand, but Cirrus7 is not that expensive within the fanless market and the quality of the entire build is very high. They also sometimes offer nice discounts for students and SMEs. There are quite a few comparable brands and also DIY options with cases from Streacom or Akasa. If you want something cheaper, Minix is pretty inexpensive, especially when you take into consideration that they offer a decent fanless enclosure.
I still find it weird/confusing why a reasonably high-end brand would be selling configs with such horrible CPUs (especially on perf/watt, considering the whole fanless thing).
But I suppose they hardly have any options if they want a socketed motherboard. Laptop chips would probably be a lot better value (both cost- and heat-wise), but then it's no longer modular, and e.g. Lunar Lake doesn't(?) even support non-soldered memory...
Why exactly?
What are a "lot of people" storing on their computers these days? Photos are in the cloud or on our phones. Videos and music are streaming. Documents take up no space. Programs are in the cloud (for the most part).
Install your favorite flavor of Linux then. Beelink devices have a good reputation for being quite happy with a new OS. It's more compatible than the latest Apple devices, that's for certain.
The old Intel machines made excellent Linux boxes, excepting the TouchBar era because the TouchBar sucked (it was possible to install Linux, it would display the fake function keys, they worked, but not a good experience). I've converted two non-TouchBar Mac laptops into Linux machines, with zero complaints, one of them is in current use (not the laptop I'm typing on this instant however).
Now there's Asahi, which as a sibling comment points out, will surely be supported for M4 eventually. This is a great time to buy the M2 Minis and put Linux on them, if that's what you're into. Or you can wait around for the M4 port, whatever suits your needs.
This is squarely in the NUC/SFF/1l-pc territory, and there is plenty of competition here from Beelink and Minisforum.
I just found the Beelink SER7 going for $509, and it has an 8-core/16-thread Ryzen 7 CPU, 32GB DDR4. The 8845 in the beelink is very competitive[1] with M4 (beaten, but not "easily"), and also supports memory upgrades of up to 256GB.
1. https://nanoreview.net/en/cpu-compare/apple-m4-vs-amd-ryzen-...
Neither gets you any kind of useful warranty, at least for most people, who are unwilling to deal with overseas companies.
Apple has actual physical stores, and a phone number you can call.
I anticipated this concern; the $509 I gave earlier is the Amazon price, which includes the mark-up. The Beelink SER7 costs only $320 on AliExpress.
Modern solid-state electronics are very reliable, most reliability issues for electronics are related to screens or batteries; which desktop computers lack. I guess there was a bad-capacitor problem over a decade ago, but nothing since then. If your risk-aversion for a desktop computer is high, you pay the Apple premium (possibly buying Applecare), or self-insure by buying 2 SER7s for nearly the same price ($640) as one regular M4 Mac Mini and keep the other one as a spare.
But that's aside from the main topic which was the personal and home use case. On that topic you get a decent set of products as well such as Pages/Numbers/etc. and others along with software support for the Mac Mini. I'm guessing the Beelink runs on Linux? That may be hard for some to work with (which is unfortunate since it's really not), or maybe they have to separately buy a Windows license? Something to consider in the comparison.
If it's audio/video, spawning VMs, it doesn't matter much. If it's for generative software, it might become an issue.
1) external storage to become faster and cheaper every year (subject to constraints around interface)
2) more and more digital assets to be cloud-native, e.g. photos stored exclusively on icloud and not on your computer
So I'm less worried about storage than some. If Asahi Linux achieves Proton-like compatibility with games [0], then we're getting closer to the perfect general purpose game console.
[0] https://asahilinux.org/2024/10/aaa-gaming-on-asahi-linux/
If you need/require Thunderbolt 5, you'll have to step up from $599 to $1,399+ for the M4 Pro-based Mac minis.
• Risk of accidental unplugging.
• Contacts may become wonky over time → see above.
• The need to sacrifice a port (or the portion of one in the case of a dongle).
• Enclosures tend to have annoying IO lights.
• Takes a bit of space.
All of these can be solved, especially when dealing with a desktop that stays in place. Paradoxically, there was never a better time to be modest with internal storage.
Although I will say:
> photos stored exclusively on icloud and not on your computer
Over my dead body :) If there’s one thing I’ll always happily carve out SSD space for, it’s local copies of my photo library!
This is quite possibly the dumbest comment I have ever read on Hacker News. Congratulations!
Apple just charges a lot of money for upgrades; they did even when it was trivial to do the upgrades yourself, and they're not going to change now that they've made any kind of internal upgrade impossible.
These days the only reasons I see to get a desktop are
1. You need some combination of power/thermals or expandability
2. Kiosks, public computers, etc
3. Cost? Maybe?
For pretty much any regular person in my life who's open to a mac, I'd point them towards a MacBook Air
1. can use a smartphone for all mobile tasks
2. see better on a large screen
3. are more comfortable with a mouse than a trackpad
4. don't have to worry about spilling tea on a laptop or dropping it on the floor. A keyboard is cheap to replace if that happens.
Mac mini with M4 starts at $599 (U.S.) and $499 (U.S.) for education.
Mac mini with M4 Pro starts at $1,399 (U.S.) and $1,299 (U.S.) for education.
> Mac mini with M4 starts at $599 (U.S.) and $499 (U.S.) for education. Additional technical specifications are available at apple.com/mac-mini.
Not really. Do a quick googling for cheap miniPCs from brands such as Minisforum or Beelink. Years ago they were selling Ryzen 5 and Intel i5 machines with 16GB of RAM for around $300. No "educational software" bullshit either, just straight from Amazon to anyone who bothered to click a button.
But I wouldn't recommend it to people who are not used to it.
I tried to recommend Linux, with XFCE setup as basically windows, and people complain. Same for ChromeOS.
> go-to recommendation I'll give to anybody wanting an entry-level desktop
Can anybody get it with educational pricing?
For half that price I can get a used Dell/HP/Lenovo micro/tiny PC with a full i7 CPU, 16GB RAM, and a 256GB SSD.
Still good to see. Great for an office PC or HTPC.
Perhaps you should check out some Beelink and GMKTec Mini PC Systems.
[0] https://www.intel.com/content/www/us/en/products/sku/186605/...
https://www.cpubenchmark.net/cpu.php?cpu=Intel+Core+i9-9900K...
https://www.cpubenchmark.net/cpu.php?cpu=Apple+M4+10+Core&id...
You’re comparing a used Tower to an Apple TV sized device.
I wonder why Apple bothers making small, efficient things when you don’t need it?
New is new and has legal ramifications; you cannot compare them unless you're throwing in a trustworthy extended warranty that matches, and pretty much nothing matches Apple in that regard.
Why start off by calling people fanboys? It seems like you're looking for an argument instead of a genuine conversation.
The lot of us grew out of gaming in our teens and do real work now.
I don't think Windows fanboys understand real people care about more than numbers.
Is the whole "unified" RAM thing a reason that the iMac and Mini are capped at 32GB?
Unified memory is much more useful when you can get more bandwidth to it.
GPU cores are generally identical between the iGPUs and the discrete GPUs. Adding a PCIe bus (high latency and low bandwidth) and having a separate memory pool doesn't create new opportunities for optimization.
On the other hand having unified memory creates optimization opportunities, but even just making memcpy a noop can be useful as well.
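A rough illustration of why skipping the copy matters, using ballpark bus figures (the PCIe bandwidth below is an assumption for illustration, not a measurement):

```python
# Ballpark time to hand 8 GB of data to a GPU over PCIe vs. unified memory.
def transfer_seconds(size_gb, bandwidth_gb_per_s):
    return size_gb / bandwidth_gb_per_s

PCIE4_X16 = 32  # ~32 GB/s practical peak for PCIe 4.0 x16 (assumption)

print(f"Discrete GPU copy: ~{transfer_seconds(8, PCIE4_X16):.2f} s")
print("Unified memory:    ~0 s (CPU and GPU address the same pool)")
```

A quarter of a second per handoff doesn't sound like much until a workload bounces data between CPU and GPU every frame or every inference step.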
Not really? Apple is efficient because they ship moderately large GPUs manufactured on TSMC's leading-edge nodes. Their NPU hardware is more or less entirely ignored, and their GPUs use the same shader-based compute that Intel and AMD rely on. It's not that Apple does anything different with their hardware the way Nvidia does; they're simply using denser silicon than most competitors.
Apple does make efficient chips, but AI is so much of an afterthought that I wouldn't consider them any more efficient than Intel or AMD.
I wonder if a little cluster of Mac Minis is a good option for running concurrent LLM agents, or a single Mac Studio is still preferable?
On the higher end, building a machine with 6 to 8 24GB GPUs such as RTX 3090s would be comparable in cost (as well as available memory) to a high-end Mac Studio, and would be at least an order of magnitude faster at inference. Yes, it's going to use an order of magnitude more power as well, but what you probably should care about here is energy per token, which is in the same ballpark.
Apple silicon is a reasonable solution for inference only if you need the most amount of memory possible, you don't care about absolute performance, and you're unwilling to deal with a multi-GPU setup.
https://www.apple.com/mac-studio/specs/
Edit: since my reply you have edited your comment to mention the Studio, but the fact remains that the M2 Max has at least ~40% greater bandwidth than the number you quoted as an example.
The M4 Pro in the Mini has a bandwidth of 273 GB/s, which is probably less appealing. But I wonder how it'd compare cost-wise and performance-wise, with several Minis in a little cluster, each running a small LLM and exchanging messages. This could be interesting for a local agent architecture.
Apple's chips have the advantage of being able to be specced out with tons of RAM, but performance isn't going to be in the same ballpark of even fairly old Nvidia chips.
One more edit: I'd also like to point out that memory bandwidth is important, but not sufficient for fast inference. My entire point here is that Apple silicon does have high memory bandwidth for sure, but for inference it's very much held back by the relative slowness of the GPU compared with dedicated nVidia/AMD cards.
It's definitely not what you'd want for your data center, but for home tinkering it has a very clear niche.
Is it? This is very subjective. The Mac Studio would not be "fast enough" for me on even a 70b model, not necessarily because its output is slow, but because the prompt evaluation speed is quite bad. See [0] for example numbers; on Llama 3 70B at Q4_K_M quantization, it takes an M2 Ultra with 192GB about 8.5 seconds just to evaluate a 1024-token prompt. A machine with 6 3090s (which would likely come in cheaper than the Mac Studio) is over 6 times faster at prompt parsing.
A 120b model is likely going to be something like 1.5-2x slower at prompt evaluation, rendering it pretty much unusable (again, for me).
[0] https://github.com/XiongjieDai/GPU-Benchmarks-on-LLM-Inferen...
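Taking the 8.5 s / 1024-token figure at face value, the implied prompt-evaluation throughput works out to roughly:

```python
# Throughput implied by the benchmark numbers quoted above.
tokens, seconds = 1024, 8.5
m2_ultra_tps = tokens / seconds     # prompt-eval tokens per second
multi_3090_tps = m2_ultra_tps * 6   # the "over 6x faster" claim

print(f"M2 Ultra:     ~{m2_ultra_tps:.0f} tok/s prompt eval")
print(f"6x RTX 3090:  ~{multi_3090_tps:.0f}+ tok/s")
```

At ~120 tok/s, a 10k-token context would take well over a minute before the first output token, which is the "unusable for me" part.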
https://machinelearning.apple.com/research/neural-engine-tra...
I hadn’t used a PC in so long, I still thought a BIOS setting decided the division. TIL.
Lucky we have Asahi Lina to clarify the details.
e.g. My Ryzen iGPU reserves 2GB/32GB for itself (which Windows can't see) via BIOS and use 9 more as shared "unified" memory.
The base model is 256GB. You can see it here:
Apple's segmentation takes place in the NAND firmware. The firmware contains the location in the storage configuration, and this may or may not be rewriteable. Iboffrc has done a video explaining how some of it works. It's all from reverse engineering, though.
The top of the line is also not where Apple is gouging the worst; it's the middle tiers that are actually relevant to many more people. Most don't need a 4+ TB main drive, but 1-2 TB is a size that's pretty easy to justify, and Apple's price is the only option for those buyers. They're absolutely lining their pockets at the expense of anyone not going for the bargain-bin base tier, which can't even hold two modern games.
It's worse than that -- 4TB gen 4 drives can be had for well under $300, sometimes $225-250, and that's for buying a drive outright, not "trading up" from a 256GB device. I think it'd be more accurate to say that you can get double the capacity for a _quarter_ of the price.
As a side note, I also try to give the loosest, most favorable (to the other side) comparison, because when I err toward my side it often turns into a "well, actually" debate about how it's "not quite X times as much, more like X-1" (never mind that X-1 is still quite bad). That gets really tedious, especially when even the favorable version of the comparison is still quite bad for their point.
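For a concrete (and deliberately rough) $/TB comparison; both prices below are illustrative examples in the spirit of the numbers above, not actual quotes:

```python
# Illustrative $/TB comparison; prices are rough examples, not quotes.
apple_upgrade = {"capacity_tb": 2, "price": 800}  # e.g. a build-to-order SSD bump
retail_drive  = {"capacity_tb": 4, "price": 250}  # a gen-4 NVMe drive, per above

def dollars_per_tb(d):
    return d["price"] / d["capacity_tb"]

print(f"Apple upgrade: ${dollars_per_tb(apple_upgrade):.0f}/TB")
print(f"Retail NVMe:   ${dollars_per_tb(retail_drive):.1f}/TB")
```

Even with generous assumptions, the per-terabyte gap is several-fold, which is the whole argument in one division.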
Except Apple wanted $3,000 for 7TB of SSD (considering the sticker price came with a baseline of 1TB).
I bought a 4x M.2 card and four 2TB Samsung Pro SSDs; it cost me $1,300, I got to keep the 1TB "system" SSD, and it was faster, at 6.8 GB/s versus the system drive's 5.5.
Similar with memory. OWC literally sells the same memory as Apple (same manufacturer, same specifications). Apple also wanted $3,000 for 160GB of memory (going from 32 to 192). I paid $1,000.
Alternatively, you can get one of these[1] external Other World Computing NVMe SSDs for $1,190 right now. And then you can easily move all your files from your laptop to your desktop when you get home.
[1] https://eshop.macsales.com/item/OWC/US4EXP1MT08/ (15% off list price as of writing)
I'm considering getting one and a nice big monitor or TV. It needs to run x-plane 12 at decent speeds and maybe support a bit of light gaming. My macbook M1 pro is actually pretty decent for this but the screen is too small for me to easily read the instruments. I expect this will do better even in the base setup.
Otherwise my needs are pretty modest. I'd love to see steam add some emulation support for these things as I have some older games that I enjoy playing. I currently play those on a crappy old intel laptop running linux. I've also been eyeing a new AMD mini PC with the latest amd stuff (Beelink's SER9).
Seems pretty nice as well and seems like it is more performance for the money. Apple is doing its usual thing of charging you hundreds of euros for 50 euro upgrades. Get the base mac studio instead. It probably makes more sense if you are going down that path.
As a bonus, you can back up your computers and iDevices to the shared local storage instead of paying for (probably much slower to access) cloud storage.
How do you find the network speed compared to internal storage?
Upgrade your memory and connect it externally over USB-C. It works brilliantly
Fortunately I don't really see the point of using a Mac mini, so this doesn't bother me too much, but... it's poor taste. "You're holding it wrong" was not cool the first time.
The issue is that Apple moved the storage controllers into their SoC. So they use raw NAND chips, and you need to use ones that the SoC supports.
I can't imagine that anyone but Apple shareholders drooling at the thought of overpriced soldered memory would prefer a smaller Mac Mini case when ~0.5" more height would get you M.2 bays for storage.
[1] https://eshop.macsales.com/shop/external-drives/owc-ministac...
For example: https://www.amazon.com/dp/B08S47KBMC/
We really are living in the future if people are using these words in combination.
Though compared to this new mini a lot will feel clunky. Any HDD enclosure is certainly larger.
There is a reason for the popularity of those enclosure/hub combos that have the same footprint and color as the Mini.
This basically proves that Apple shot themselves in the foot for AI on mobile by artificially restricting RAM for so long! Heck, even the Neural Engine has turned out to be basically useless despite all their grandstanding.
So alas, their prior greed has resulted in their most popular consumer iDevices being the least AI compatible devices in their lineup. They could’ve leapfrogged every other manufacturer with the largest AI compatible device userbase.
What they shot was us. My 14 Pro won’t do AI despite having a better NPU than an M1, all because Apple chose - intentionally - to ship it with too little RAM. They knew AI was coming and they did this anyway.
Although having played with it on my MBP it’s clear I’m not missing much. But still.
It's Unified RAM. So that memory is also used for the GPU & Neural Cores (which is for Apple Intelligence).
This is actually why companies moved away from the unified memory arch decades ago.
It'll be interesting to see as AI continues to advance, if Apple is forced to depart from their unified memory architecture due to growing GPU memory needs.
Today, the industry is moving toward unified memory. This trend includes not only Apple but also Intel, AMD with their APUs, and Qualcomm. Pretty much everyone.
To me, the benefits are clear:
- Reduced copying of large amounts of data between memory pools.
- Improved memory usage.
- Generally lower power consumption.
I don't understand - wouldn't the OS be able to do a better job of dynamically allocating memory between say GPU and CPU in real time based on instantaneous need as opposed to the buyer doing it one time while purchasing their machine? Apparently not, but I'm not sure what I'm missing.
The usual reasoning that people give for it being bad is: you share memory bandwidth between CPU and GPU, and many things are starved for memory access.
Apple’s approach is to stack the memory dies on top of the processor dies and connect them with a stupid-wide bus so that everything has enough bandwidth.
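The resulting bandwidth is easy to estimate from bus width and transfer rate. The figures below are the commonly cited LPDDR5X configuration for the M4 Pro, so treat them as illustrative assumptions:

```python
# Peak bandwidth = bus width (in bytes) x transfers per second.
bus_width_bits = 256          # commonly cited for the M4 Pro (assumption)
transfers_per_sec = 8.533e9   # LPDDR5X-8533

bandwidth_gbs = (bus_width_bits / 8) * transfers_per_sec / 1e9
print(f"~{bandwidth_gbs:.0f} GB/s")
```

That lands right on the 273 GB/s figure quoted for the M4 Pro elsewhere in this thread, which is the payoff of the "stupid-wide bus" approach.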
And besides, what Apple is doing is placing the RAM really close to the SoC, I think they are on the same package even, that was not the case on the PC either AFAIK?
1. Interesting that they did not have this as part of an event. I think this means either they do not have much else to share around the Mac right now, or the opposite: there just won't be room to talk about the iMac or Mac Mini. I am leaning towards the former, as I suspect the other computers in their lineup will just receive a spec bump soon.
2. On the product page (https://www.apple.com/mac-mini/) Apple highlights a number of third-party accessories, notably the PS5 controller and several keyboards and mice from different manufacturers. This seems small, but it would have been almost blasphemy under the Jobs era.
3. This is quite the little powerhouse. Honestly it is so good it eliminates the need for most people to even consider the Mac Studio.
I feel like Jobs was a lot more pragmatic than we give him credit for. I mean we had the HP iPods, iTunes on Windows etc.
And the Mac mini's tagline was "BYODKM" (Bring Your Own Display, Keyboard, and Mouse) at the very start, fully putting the spotlight on third parties and composability.
And you can Remote Play to your PS5 (or PS4), which also works great if your internet isn't terrible.
Mac Studio and Mac Pro are getting upgrades next year apparently.
The MacBook Air with the M chip was absolutely a steal already. I'm surprised by this.
Is that something to allow cheaper M-series/ARM machines in DCs? Is Apple getting affordable oO?!
Would be interesting to know, though, whether the margin stayed the same and they literally just save a lot of money.
My M2 Mac Mini that I got for $499 is my favorite gaming computer I've had in a long time. It runs many games (WoW, Dota, League of Legends, etc.) great. Anything it doesn't run due to macOS, I use GeForce Now over Ethernet. And this was with 8GB of unified memory; with 16GB it'll be even better value.
Very excited to see how the GPU has improved in the M4, especially the Pro model.
Linux GeForce Now can only do 720p or 1080p, can't remember which. Also, it's just kind of laggy in desktop mode. The Macs run so much smoother.
My current "main" desktop is actually my Asus ROG Ally. I use one USB C hub that is capable of 4k120hz, and I can move it between my Mac laptops and Asus ROG Ally very seamlessly.
The problem for me is Windows. Yesterday my start menu stopped loading for some reason and required a full reboot. Sometimes it refuses to go to sleep. Sometimes it refuses to come out of sleep. Sometimes a Windows update kicks off in the middle of a game and it slows everything to a crawl. Windows drives me crazy these days!
At least it boots.
I purchased a Surface Pro 8 a year or so ago, thinking Windows would surely work better than usual when it's Microsoft's own hardware too.
But no, yesterday it got stuck in a boot loop after a Windows update broke the audio drivers somehow. The Windows logs/reliability report could only tell me it "shut down abnormally", without any technical details whatsoever.
I still have to use Windows on my desktop because of Ableton, but I'll never purchase any Microsoft hardware again, and as soon as I can, I'll run Ableton on Linux like the rest of my software.
Perennial truth since XP
Vista was okay after hardware caught up and it got a few patches.
10 was a huge improvement over 8.x, but 11 has had a lot of bugginess for me, particularly related to the new Start menu.
These Cable Matters ones work well, but need their firmware flashed. Luckily, it was extremely easy and worked great for me. Here's an Apple forum thread about it: https://forums.macrumors.com/threads/dp-usb-c-thunderbolt-3-...
Here's the exact model I have: https://www.amazon.com/Cable-Matters-Ethernet-Delivery-Charg...
Neither MacOS nor Windows are very good console OSes - you're really better off using Linux where anticheat isn't concerned. Even on the Ally.
They're great OSes for consumers who don't really work on their computers, and just want something that caters to the lowest common denominator.
For professionals who use computers for work, Linux is really the only option that doesn't eventually get in your way. You can set it up and leave it as-is, with only security updates, and everything keeps working the same way, basically forever.
I've tried to set up an experience like that on both macOS and Windows, but eventually the company will find a way of forcing an update on you, intentionally or not.
Even for non-gaming use cases this idea is a bit dated. Printing is by far the best experience on Linux. The "tweaking" that every Windows/macOS user claims you need to do isn't really a thing these days, except with NVIDIA (I'm not sure what the current status is, but it was borked somewhat recently). Sure, if you want to go beyond what Windows/macOS can offer then tweaking may be required, but the current UIs are extremely comprehensive.
I had a 80yr old lady up and running in one day with PopOS. If that's not lowest common denominator, I don't know what is.
Professional work can be hit and miss. Depends on how draconian your workplace software is.
I really hope you're not expecting anyone to take you seriously with this. On principle I get what you're saying but in practice no one who works as a professional in any field has the time (or expertise) to be worried about configuring their operating system.
As a Linux evangelist who begrudgingly daily drives a Mac, this kind of attitude is what does us in. It's the cocksure "akshually Linux is best" even when it materially, experientially, just isn't.
Denial is not a design ethos.
Linux is best because it lets you use your computer for whatever workflow you need.
I think it's a good thing for 99% of computer users to not be able to just run any random software they download off of the internet. Gatekeeper, XProtect, and notarization are unfortunately necessary in the hostile computing environment we live in today. Aunt Tilly will happily download "PhotoShop" from that sketchy Russian Warez site and infect her machine if these protections didn't exist.
For power users that know what they're doing it is trivial to just use something like Homebrew or to bypass these protections on a case by case basis as needed. I can also run software in a Linux VM quite easily as well for open source software that isn't well maintained on macOS.
Professionals should absolutely take it seriously because time spent updating Windows or even just waiting around while it gets its shit together is time you could have spent doing your job and making money. In fact, Windows and its spontaneous updates with obnoxious focus stealing prompts are major risks to the integrity of your work and might cause you to have to redo it from scratch, lowering the value of your time even further.
Linux boots in less than ten seconds and is already ready to use. There are distributions for all levels of expertise, and if there's an IT department it should be managing those boxes anyway. All that's missing is the Microsoft Office suite and in the end that's what the Windows vs Linux battle always boils down to. People put up with it because they just need muh Excel.
1. Requires Windows Pro or Apple Developer license to unlock full featureset
2. Cannot reasonably disable targeted advertising or ad data collection from either OS
3. Neither comes with a package manager, and neither respects third-party packaging either
4. Can be "managed" insofar as your buggy CPM software allows, often glitched by the OS itself
5. The experience is always getting worse since Apple and Microsoft share a united front of making people spend as much money on useless shit as humanly possible
Now, that's not to say nobody should use these OSes - certainly people are locked into them for some purposes. But as a programmer it's genuinely hard for me to be productive on these OSes because I end up fighting them just for everyday, non-programming purposes.
I think it's entirely possible that macOS and Windows can be inherently terrible experiences while also being mandatory for certain workflows.
1. An Apple Developer license is only required for distributing software in App Stores and notarizing.
2. I'm not sure what ads you're talking about in macOS. I've only ever seen them in the completely optional App Store.
3. Installing Homebrew is literally a one liner. I've never used it, but Macports appears to be similarly easy as an alternative.
4. I can't speak to this point, so I'll take your word for it.
5. I only started using macOS since the Apple Silicon era, but as far as I'm concerned the experience just keeps getting better and better. Every release of macOS has added features I enjoy and use constantly. Just the seamless integration between all of the Apple products in my house was worth switching from my previous mix of Windows, Linux, and Android.
Edit: I am silly. Of course, people mean hooking it up to a bigger screen.
I was just looking at a MacBook Air yesterday, but I just can't get over the complete ripoff of a memory upgrade from the base model.
16 gig starting at $599. I honestly don't need to know anything else to buy one.
(The only thing I do often that's CPU-limited is compiling, being faster at that saves me maybe a few minutes in a full working day; I don't care. I am frequently limited by RAM and I really hate shuffling things around to make space on drives.)
Can you elaborate? Thinking of setting up a Mac Mini for my kids but worried about the lack of gaming options for them (I haven't gamed on a Mac in a dozen years and the state of gaming on macOS was sad back then).
Here's a list of my most played games on my Mac in the last couple of years:
WoW, Hearthstone, Dota 2, League of Legends, Thronefall, Vampire Survivors, Baldur's Gate 3, Cult of the Lamb, Balatro, Death Must Die, Terraria, Dave the Diver, Mechabellum, Space Haven, Hades 2, Peglin, Stellaris, RimWorld, Dead Cells, Total War: Warhammer 2, Valheim, Civilization 6, Slay the Spire, Don't Starve Together, Cities: Skylines, Oxygen Not Included, SUPERHOT.
Games I play through GeForce Now:
Fortnite, Diablo 4, WoW, Apex Legends, Halo Infinite, Baldur's Gate 3, Cyberpunk 2077
The point of such an annoyingly long comment is to demonstrate that there is a very substantial Mac gaming library. The problem is that a new shiny game comes out that doesn't support Mac and you don't want to be the ONE guy in your group who can't play it because you're on Mac. The latest one for me is Deadlock. Not on GeForce Now, not on console, not on Mac... so I needed to get a Windows PC.
But if you're a kid and just looking for a general gaming machine, it plays a ton of cool stuff.
I noticed the other comments mentioned GeForce Now over _ethernet_. What connection speeds do you typically need to play these games over GeForce Now or similar?
Nvidia's cloud gaming offering. It works pretty well.
The free tier is mostly crap (you only get to play if no paid users are using the capacity pretty much), but the paid tiers go from good to excellent.
Its main selling point is that you don't need to buy games for it separately, you can use your existing Steam catalog for example.
(Apologies if this seems like a stupid question. I've not played games for a very long time, mainly because most stuff doesn't seem to be available on Macs).
You can't do x86/x64 Windows on M-series Macs without emulation and it is generally a poor experience. There's a few things like Crossover, Parallels, etc that can help you run Windows games.
But I have found that most of the games I care about are either Mac native or on GeForce Now at this point. There's a surprisingly large game catalog on Mac now.
So the short answer is that some of them run on some sort of Windows compatibility layer, some are Mac native, some I stream. But most of my favorites run native on Mac.
To be honest, there are so many games to play these days that I don't mind missing out on a few titles. Valorant is a good example of a game that I can't play on Mac, GFN, or Crossover. But it's OK, I still have CS2.
10 core = 4 P and 6 E
12 core = 8 P and 4 E <-- 2.0x P core over base
14 core = 10 P and 4 E <-- 2.5x P core over base
EDIT: Updated with known P and E amounts.
Thanks HN for posting below.
https://www.apple.com/mac-mini/specs/
https://www.apple.com/newsroom/2024/10/apples-new-mac-mini-i...
M4 Pro 12-core = 8P, 4E
M4 Pro 14-core = 10P, 4E
I don't see it stated what the specific P vs E cores are for the 14-core version on:
> With up to 14 cores, including 10 performance cores and four efficiency cores
They've backtracked from the M3 Pro P:E ratio downgrade, which is a welcome surprise.
I’m inclined to trust Apple with this information but the skeptical side of me is questioning, how can we fact check this data? If it’s true it is very cool.
But ultimately it's down to the third-party auditors they hire.
The fine print says:
> Carbon reductions are calculated against a business-as-usual baseline scenario: No use of clean electricity for manufacturing or product use, beyond what is already available on the latest modeled grid; Apple’s carbon intensity of key materials as of 2015; and Apple’s average mix of transportation modes by product line across three years. Learn more at apple.com/2030.
https://www.apple.com/2030 which mostly seems to focus on the goal of being 100% carbon neutral in energy use.
It sounds like they're generally only looking at carbon emissions from _energy_ use in transportation and manufacturing, and they're probably using some sort of carbon offset to achieve that "net zero". They're probably also not counting carbon emissions from building construction and they're probably not counting carbon emissions from meat served at corporate events, etc.
Update: I found a breakdown for the Mac Mini (linked from the apple.com/2030 page).
https://www.apple.com/environment/pdf/products/desktops/Mac_...
> 100 percent of manufacturing electricity is sourced from renewable energy
> For Mac mini, we are matching 100 percent of expected customer product use electricity with electricity from low-carbon sources.
They are counting transportation in the "100 percent", but are offsetting it with carbon credits.
"According to the World Gold Council, recycled gold accounted for 28 percent of the total global gold supply of 4,633 metric tons in 2020; 90 percent of that recycled gold comes from discarded jewelry and the rest from a growing mountain of electronic waste such as cellphones and laptops."
It's not entirely unreasonable to ask companies to be responsible for carbon capture or in the short term an offset for their employees breathing on the clock, as funny as that sounds.
We need to take all sources of carbon emissions seriously. This shouldn't be downvoted.
Unless you think their employees breathe more when they are on the clock than off it, I'm not sure this makes sense. When they're off the clock, they might be exercising or playing with their kids, so perhaps they actually breathe less when sitting at their desks on the clock.
Like someone else said, spending is a very good proxy for CO2 emissions, and about 68% of all spending is "consumer spending", which basically means keeping people alive, somewhat happy and somewhat productive.
I am thinking it may be better for corporate to buy this and run Windows in a VM than to buy a PC.
Considering the iPad and iPhone have been replacing 99% of my workflow outside of the office, I am wondering if my next computer could be a Mini rather than a laptop.
Last time I bought a Mac Mini was before the 2018 model got introduced, and I almost took it back in to get it exchanged (I was within 30 days of purchase when the 2018 model dropped), but it's been plugging away doing everything I have asked of it for 6 years, and it's still going strong. All the upgrades since have left me a little cool, but this genuinely looks like a contender for an upgrade. Only thing stopping me from getting the credit card ready is waiting to see what the M4 MacBook Air - which is inevitably going to be announced in the next 72 hours - looks like in comparison.
https://www.amazon.com/Sabrent-Mount-Under-Black-BK-MABM/dp/...
Apple wants to 'show it off'
The only reason they might not is that they want to keep everything across the entire line, and the highest end Mac Studio probably needs more power than USB can offer.
The internal power supplies in Mac minis have been extremely reliable and the fewer cables the better, in my opinion.
That gives a lot more options IMHO on how to handle power for this machine, including portability, even if it's supposed to be a desktop machine.
I thought the same for the minisforum machines which would be competitive to this, they have a 19V input that really should be USB-C at this point.
The thing is, M chips run so efficiently that even with the power loss from conversion, it’s still pulling less power than a lot of PCs.
But yeah it running on a sleek dc battery would be a lot cooler
https://www.notebookcheck.net/fileadmin/_processed_/6/b/csm_...
Also just connecting the mini pc to a monitor with PD and not using any extra power brick at all seems like the much more relevant comparison.
If someone didn't know, macOS ignores fsync, so without a UPS your data is not safe. Not an issue for laptops, obviously, but an issue for battery-less devices.
FWIW while fsync() on Linux does request that the drive flushes its hardware cache, it's up to the drive whether to actually honor this request, and many don't do so synchronously (but still report success). So unless you control the whole setup end-to-end - hardware and software both - you don't actually have the guarantees.
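For anyone who wants the strongest durability macOS offers, the usual workaround is fcntl's F_FULLFSYNC, which additionally asks the drive to flush its own write cache. Here's a minimal sketch in Python; the `durable_write` helper name is made up for illustration, and it falls back to plain fsync on platforms without F_FULLFSYNC (where, as noted above, the drive may still not honor the flush):

```python
import fcntl
import os

def durable_write(path: str, data: bytes) -> None:
    """Write data and push it as far toward stable storage as the OS allows.

    On macOS, fsync() only flushes OS buffers to the drive, not through the
    drive's own write cache; F_FULLFSYNC asks the drive to flush that too.
    On platforms without F_FULLFSYNC we fall back to plain fsync().
    """
    fd = os.open(path, os.O_WRONLY | os.O_CREAT | os.O_TRUNC, 0o644)
    try:
        os.write(fd, data)
        if hasattr(fcntl, "F_FULLFSYNC"):  # macOS (and some BSDs)
            fcntl.fcntl(fd, fcntl.F_FULLFSYNC)
        else:
            os.fsync(fd)  # best effort; the drive may still lie about it
    finally:
        os.close(fd)

durable_write("/tmp/durable_test.bin", b"hello")
```

SQLite and other databases use exactly this fcntl on macOS for the same reason.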
It needs 100-240V, 50-60Hz AC power.
As someone who lives in a very dusty 150-year-old house, my Mac Studio does not appreciate the air intake being directly on the desk. It collects all the dust that lands anywhere near it.
I have a large levoit air filter running 24/7 in my office and still end up with this[1] regularly. I wish I could at least reasonably take the thing apart to clean it out.
- Running an Air Filter 24/7 has huge diminishing returns (i.e. waste of electricity). They are best run at max fan speed for short durations instead.
- Elevate it with a platform.
- Get a vacuum (or even a robo vacuum). I grew up in a 100+ year old house, it wasn't dusty, and it had hardwood/brick everywhere.
Large buildings don't run their HVACs in burst and then turn them off.
I did this experiment in two locations. If I’m in the more urban area, running the air filter 24/7 was necessary.
Spigen LD202 Designed for Mac Studio Desktop Stand Mount with built-in Air Filter - Crystal Clear
https://www.amazon.com/Spigen-Designed-Desktop-FIlter-Crysta...
This one from the related products actually looks maybe a little more promising
However, I don't see how this leads to more dust going into the computer compared to e.g. front-facing ventilation.
The dust landing on the desk next to the computer will slowly drift down onto the surface, passing right in front of any opening and being sucked into the device anyway.
I agree that mine also gets dirty as well but nothing like your picture where it’s caked there.
I typically just wipe it clean after a couple of weeks. Can even go a month without any issues. I even have a dog that sheds like a mofo
Please take care of your health. Just saying as a fellow HN friend.
> The M4 chip brings a boost in performance to iMac. Featuring a more capable CPU with the world’s fastest CPU core,(4)
Then, deeper in the footnotes where no one ever reads
> (4) Testing was conducted by Apple in October 2024 using shipping competitive systems and select industry-standard benchmarks.
Basically, this is The Fastest CPU Ever* *we tested it, trust us.
How anyone can still give money to this company is a mystery to me.
* Kept HDMI
* New, much smaller form factor
* Front facing USB-C
* Base model has 16 gb of ram
I’d probably get 32GB. I started buying 16GB Macs in 2013. The extra RAM will keep any Mac useful for a few extra years. In fact, my 2013 Intel MB Pro would still be great if I could upgrade the OS
M4: 21k lines / (core-second) https://browser.geekbench.com/v6/cpu/8495624
M2: 16k lines / (core-second) https://browser.geekbench.com/v6/cpu/8546977
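As a quick sanity check on those two Geekbench-derived figures, 21k vs 16k lines per core-second works out to roughly a 31% per-core uplift:

```python
m4 = 21_000  # lines / (core-second), from the M4 Geekbench run above
m2 = 16_000  # lines / (core-second), from the M2 run

uplift = m4 / m2 - 1
print(f"M4 vs M2 per-core uplift: {uplift:.0%}")  # ~31%
```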
And Apple has a long history of making this change ahead of the rest of the market. It's been years since they moved to all USB-C in their laptops, so IMO, it was only a matter of time.
And yeah - upgrades are awful price wise. From what I can tell, it's basically only worth it to buy base models unless the machine is making you money. Hopefully they upgrade the Mac Studio to M4 down the line.
I agree. My wife has a MacBook that is USB-C only, and it turns ten years old in a couple of months.
Honestly kind of want one as a desktop, even though my M1 Pro MBP is still insanely powerful for my needs.
"And with the industry-leading reliability of macOS, healthcare systems can count on mini when providing critical care."
A bit out of character, and also – what?!
Hats off! I didn't expect the Mac to be next in line for the carbon neutral goals. But they did it!
This looks to me like one instance where the incentives are decently working, at least to some point.
https://www.apple.com/environment/pdf/products/desktops/Mac_...
> Only after these efforts do we cover residual emissions through high-quality carbon credits that are real, additional, measurable, quantified, and have systems in place to avoid double-counting and ensure permanence.
Better than nothing...
Also interesting:
Maxed out: Mac mini with M4 Pro (64GB memory, 8TB SSD): Product footprint before carbon credits 121 kg CO2e
Min spec: Mac mini with M4 (16GB memory, 256GB SSD): Product footprint before carbon credits 32 kg CO2e
I wouldn't have thought that there is this much of a difference in electronics!
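Back-of-envelope on those two figures from Apple's own report — the maxed-out config carries nearly 4x the footprint of the base one, presumably dominated by the extra NAND and DRAM:

```python
maxed = 121  # kg CO2e, M4 Pro / 64GB / 8TB, before carbon credits
base = 32    # kg CO2e, M4 / 16GB / 256GB, before carbon credits

ratio = maxed / base
print(f"maxed-out vs min-spec footprint: {ratio:.1f}x")  # ~3.8x
```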
[1] enclosure: https://www.amazon.com/gp/product/B0BB74BQVN/
The disk I put in there is a SK Hynix Gold P31 2TB, I am not getting its full speed with this enclosure so you can probably get a slower one and get the same results.
It's always been a slightly clunky experience - having to eject them before I can undock my laptop, or the way they never go to sleep (some issue with CalDigit TB dock...?)
I used to think of them as a backup, but since moving house a couple of years ago my internet is fast enough to make Backblaze viable
Next time I upgrade I'm just going to have fewer boxes on the desk and less power-drawing crap plugged in all the time
I hate the price of 8TB storage on these though :(
[1] enclosure: https://www.amazon.com/gp/product/B0BB74BQVN/
Does anybody have a guide or tips on how to make one of these better for hosting a website with cloudflare tunnel and being resilient to power outages?
Is there a solution to log in to the OS GUI over wifi (like from an ipad or mac) if I need to use it as a computer? It won’t have a screen attached.
NoMachine runs alright on macOS.
I use Chrome Remote Desktop to get into the box remotely. If the box does end up losing power/restarting, I also make sure to have SSH on so I can ssh into the box and start remote desktop before being logged in (Google provides instructions).
I found this to be the path of least resistance to getting it remotely accessible.
The first version of OS X I used was Mavericks. In hindsight, that was the last great version of OS X for me — the last version where it seems the priorities of the people deciding the direction of development were somewhat aligned with mine.
Many have written about the decline in usability and attention to detail in OS X since then — I guess Apple Intelligence represents this shift in focus perfectly: a black-box interface that may or may not do something along the lines of what you were intending.
The published specs call out three 6K screens, but is that a display bandwidth limit or an arbitrary “screen” limit?
I’d like to drive four displays, and 4K is sufficient for me … possible? Perhaps with number four on the HDMI port?
M4 (Thunderbolt 4):
- Up to three displays: Two displays with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 5K resolution at 60Hz over Thunderbolt or 4K resolution at 60Hz over HDMI
- Up to two displays: One display with up to 5K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
M4 Pro (this one has Thunderbolt 5):
- Up to three displays: Three displays with up to 6K resolution at 60Hz over Thunderbolt or HDMI
- Up to two displays: One display with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
There is really no reason you couldn't drive four (or more) lower resolution (4k) screens, given the array of ports.
In case anyone is wondering, the use-case here is a triple-monitor configuration at a desk with a much larger "TV" positioned, or hung, elsewhere in the room.
You can add as many extra displays as you want using DisplayLink which runs as a standard USB device and doesn't use the built-in controllers, but has worse performance, probably good enough for a "TV" though.
Back ports: 3 Thunderbolt 4 ports (Thunderbolt 5 on the top $1399 tier), HDMI, Gigabit Ethernet
RAM can be upgraded to 32GB on M4, to 64GB on M4 Pro
10 GbE looks selectable on any of these, +$100
They're comparing three generations back now?
Oh, I see that they never updated the Mini for M3. So it's only two generations of Mini. Still, I prefer to see one generation comparisons. And it's kind of weird that Apple doesn't keep their smaller product lines more up to date. They certainly have the resources to do so.
My guess is the leap from intel to M1 was significant for an upgrade and M1 vs M2/M3 wasn’t really. I’m personally on an M1 and use it heavily but I don’t think I need the M4 jump still.
The mini hardware is appealing to me though.
Though honestly my M1 Pro MBP from 2021 still performs so incredibly well I have no desire to upgrade anytime soon. Best computer I've ever purchased.
Does this mess up datacenters using Mac Minis in racks now?
This guy will probably have a lot of clients: https://www.youtube.com/watch?v=E3N-z-Y8cuw
Normal computers with NVMe storage will always be more repairable than Apple's hardware with everything soldered on the board.
Is that new? I ran an iMac with a dead internal drive off of an external Thunderbolt drive for several years.
Having a dead SSD seems to kill these computers. That's expected for something with soldered flash chips.
(4 got just minor damage, miraculously)
1: https://www.theguardian.com/world/2024/oct/29/tram-derails-a...
This is such an Apple stat especially for a game. What does "faster gaming performance" even mean? Every zone and city hub loads 13.3x faster so loading screens are quicker? They don't say anything about FPS and no one would use "faster" as a synonym for higher FPS.
An MMO is really not the best benchmark tbh
Edit: notes has the compared spec "Results are compared to previous-generation 3.2GHz 6-core Intel Core i7-based Mac mini systems with Intel Iris UHD Graphics 630, 64GB of RAM, and 2TB SSD."
So they compared the 2024 M4 to a 2018 8th gen Intel i7 (i7-8700B). Take that as you will
https://youtu.be/eaB7nCdId0Y?t=364
https://www.theverge.com/2024/10/28/24281965/honestly-this-i...
For example: https://support.benchmarks.ul.com/support/solutions/articles...
I wish Apple devices were more upgradable (and cheaper and more fixable), but I would speculate that Apple devices are the last devices to end up in a landfill (or more aptly, recycled). If you outgrow a device there is a very robust resale market and that machine will happily fill someone else's needs.
Apple devices seem to stay in use for an eternity.
The vast majority of people who will buy this will be just fine with that level of performance for many years to come.
Have you installed a server CPU?
It’s really easy to fuck up and lose a few channels of memory due to bad contact. Right now I’ve got a 3647 Xeon Phi CPU that’s refusing to train DIMM A1 for _reasons_
That’s not an experience Apple wants any user of their products to have.
Here’s an example BGA socket: https://www.ironwoodelectronics.com/products/bga-sockets/
Not something that’s going in a tiny laptop chassis.
Or you could get an M3 Max, run the memory at twice the speed, still have a 512-bit wide memory bus, and have 10+ hour battery life. Presumably similar with the M4 Max, rumors claim later today (Wednesday).
How much do you want that socket?
Yeah, just take a look at PCIe 5 and its 512GB/s of bandwidth.
> Have you installed a server CPU?
Yeah, and none of the problems you mentioned.
> That’s not an experience Apple wants any user of their products to have.
Yeah, just look at the older Macs with upgradable components and how easy replacing them was... So, instead of making it easier, let's just remove it altogether.
Apple knows how to make money. I can buy a quality 4TB NVMe drive for $300 (you can definitely go lower if you want to risk it). The upgrade to 4TB on the M4 Pro Mini is $1,200 (it's not supported on the base model), on top of $1,400 for the actual computer.
If I had to guess, most of Apple's margin is on users riding the pricing ladder up into the stratosphere.
I had an experience a few years ago at an Apple Store, where the clerk refused to sell me the cheapest M1 MacBook Air. There's probably some direction from up top trying to convince people they need the more expensive Macs.
All Macs that I know of let you configure the boot drive. I had an older Mac Mini with a spinning HDD. I added an external SSD, set that up as the boot drive, and never touched the slow drive again. I'd be extremely surprised if you couldn't do the same with this.
https://support.apple.com/guide/security/boot-process-secac7...
If it does, you can get the SSD chips replaced. That is well proven now. Granted it needs a specialist with rework kit but they are starting to become more common now that it's an issue.
The memory chip is embedded in the SoC, how do you envision a way to do replacement of memory chips with this design/architecture?
https://www.anandtech.com/show/21390/micron-ships-crucialbra...
Even if Apple wanted to support modular memory, which they obviously don't, the ultra-tiny form factor of the new Mini would probably still rule it out though. Soldering the memory down is still more compact.
It's basically a 12 year old PC shrunk into a tiny box and low power budget.
My point is that this size of device is already available with upgradability so the form factor isn't the issue. Apple is significantly better at engineering products than these random companies and they could surely have made this new Mac Mini upgradeable. I do understand why they wouldn't want to though!
That N100 is an Intel Atom, quite a bit of a downgrade from the M4 with the fastest CPU cores out there.
The pin density on a BGA memory is like 0.3mm for the type typically used by Apple. That’s 200 0.3mm pins that have to line up and work at 4GHz and survive you dropping it 5 feet.
I actually think Apple's way of managing upgrades isn't as harsh as many people think.
The first step toward sustainability is to use less. If you drop the extra hardware needed to make a machine easily upgradable, you simplify the device and use less material. This is one of the reasons Apple does it.
Secondly, they're using a lot of recycled material in this thing. Their lede line is that it's carbon neutral. Show me another desktop PC that can make that claim.
Thirdly, the "half-life" of a Mac is kind of insane. When I was buying Thinkpads, Dells, and the like, I'd get 2-3 years down the line and I'd "need" to upgrade the whole thing. I've got a 2017 Mac Mini, and an 2015 MBP in regular use. I have a G4 iBook that was in active use by my parents from 2004 until _this Spring_ - they only gave it up because they couldn't upgrade Chrome on it any more, so it's about to become a retro Linux term for me, because the hardware is still sound (albeit too under-powered for anything modern).
And lastly, they take old hardware in and recycle it back into the new stuff in the first step. They give relatively decent trade-in prices, and are one of the few consumer brands doing that.
Given that they're shipping it with 16GB of RAM, which is fine for my needs, I think I'm confident in saying I could buy one, use it for 5-8 years, and then get it recycled when I upgrade at that point, while most PCs with upgradable RAM being sold today are going to landfill within 4 years, perhaps.
I'm trying to decide if I should get the Pro or the base model mini. I've been learning Swift and Metal using an old work MacBook and I want to get my own hardware. The only games I play recently are Factorio and Baldur's Gate 3, so I was thinking perhaps I should get the Pro and not bother upgrading my desktop (an i7-6700K from 2015).
Up to three displays: Two displays with up to 6K resolution at 60Hz over Thunderbolt and one display with up to 5K resolution at 60Hz over Thunderbolt or 4K resolution at 60Hz over HDMI
Up to two displays: One display with up to 5K resolution at 60Hz over Thunderbolt and one display with up to 8K resolution at 60Hz or 4K resolution at 240Hz over Thunderbolt or HDMI
Are these set in stone? Would it be enough to run, say, two external 2560 x 1440 displays at 144Hz? And if this is cross-connected with TB4 networking and using exolab, it might be good for a nice local setup.
Anyone up to try this out?
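Back-of-envelope for the two-1440p-at-144Hz question, assuming uncompressed 24 bits per pixel and a crude 20% allowance for blanking (real CVT-RB timings vary): each stream needs roughly 15 Gbit/s, so two together are around 30 Gbit/s — plausibly within Thunderbolt's raw capacity, though the supported-display list above is what macOS will actually allow.

```python
def display_gbps(width: int, height: int, hz: int,
                 bpp: int = 24, blanking: float = 1.2) -> float:
    """Rough uncompressed bandwidth for one display, in Gbit/s.

    blanking=1.2 is a crude allowance for horizontal/vertical blanking
    intervals; exact overhead depends on the timing standard used.
    """
    return width * height * hz * bpp * blanking / 1e9

one = display_gbps(2560, 1440, 144)
print(f"one 1440p/144Hz display: {one:.1f} Gbit/s, two: {2 * one:.1f} Gbit/s")
```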
https://aws.amazon.com/ec2/instance-types/mac/
Amazon will have to accommodate the new form factor. They've already had to accommodate the previous mini and the Studio.
https://www.macrumors.com/2024/10/28/apple-promises-two-more...
Edit: also I noticed they moved the power button to the bottom corner of the Mac Mini! (It used to be on the back as well.) This makes me think even more that they didn't want to crowd up the back too much.
I'd guess.
If you use headphones all the time, you'd plug them into the back, or the monitor, etc.
And to further muddy the waters, the space in the 7 chassis for the jack was mostly still available, which led to that one madlad bodging in his own headphone jack, for a one of a kind iPhone 7:
https://hackaday.com/2017/09/07/bringing-back-the-iphone7-he...
But yeah the headphone jack dropping was obviously just to get more people onboard with AirPods that launched at the same time. And you can't say it didn't work! I remember when the first images of people wearing AirPods came out and it was the laughing stock of the internet. People said it looked like you had Q-tips hanging out of your ears, or the tips of an electric toothbrush.
A few years later and they're pulling in tens of billions of dollars per year, just on AirPods sales alone. AirPods could be pulled out into its own business and it would be seen as a wildly successful tech company.
You may still need an amp for electrically incompatible (high impedance) headphones.
But yeah, anyone "serious" would go discrete for all that stuff regardless. I guess this also lets Apple sidestep a bunch of fuss around non-stereo use-cases, for people who want quadraphonic or 5.1 at their workstation.
Would have been nice to have audio both in the front and rear, with front audio overriding rear audio (like in most desktops), but I guess that would have been too much maximalism for Apple
C'monnnn. Give us custom colors, like you already do for the iMac.
Otherwise seems like a fine machine for those who want UNIX and energy efficiency.
I really want to see what they're going to do with the Studio this/next year. M4 Ultra could be insane.
+ form factor
+ ARM/Apple silicon/SoC
Negatives:
- Apple tax on memory, storage
- non-upgradeable
- Apple tax on 1GbE to 10 GbE ($100 surcharge lol)
- maxes out at 64GB of configurable memory w/ M4 Pro
Got to give it to Apple. The tranches between different configuration levels are “small” enough to convince buyers to step up to the next level.
It’s like “hey, you’re already at $4000 for the M4 Pro with 64GB. Just spend an extra $400-$600 for that bump in storage. It’s no biggy. We’re losing money at this point”.
I wonder how many consumers fall into this sunk cost fallacy scenario that Apple has designed.
A PCIe 10GbE NIC using the same Aquantia chipset as existing Macs is $80-$90. https://www.amazon.com/ASUS-XG-C100C-Network-Adapter-Single/...
edit: in fact, compared to Apple's usual price gouging for extra RAM/SSD/etc it is downright reasonable.
He has done this for nearly every product line (iPhone, Mac Mini, iPads, …)
* Mac Mini storage upgrade https://youtu.be/ApAffuPAl5A
Almost all of them, and the lowest tiers of the entire lineup are essentially ewaste in a box just there to push a few hundred $ on you.
For those who need more performance, better value is found at higher rungs of the product lineup, and this has been Apple's strategy for decades.
Millions. Let that sink in.
Just ordered the base M4 Pro so will just plug that into it and carry on.
https://press.asus.com/news/press-releases/asus-proart-5k-pa...
...to be looking really appealing to pair with one of these new Mac Minis.
$899 MSRP in the US, 5120 x 2880, same dimensions as an Apple Studio Display but a lot cheaper... And B&H just got them in stock.
Just ordered one myself, now I need to pick which variant of the Mac Mini M4 to pair it with. (My goal here is replacing a 27" Intel iMac for map making / CAD / DTP type stuff.)
M4 Air should be in the spring.
Even the iKeyboard I bought from them last year was Lightning to USB-A and needed a dongle to connect to my Apple laptop.
At least it has an HDMI port.
I've been semi-lightweight gaming on my M1 MacBook Pro with some level of success...
Games that are optimised to run on Apple Silicon natively mostly run great (No Man's Sky easily pushes 120FPS at "almost-maxed-out" settings at native resolution and looks amazing on the built-in display).
Games running in Rosetta also work well. The performance hit of Rosetta is only a couple percent.
Less demanding games such as Minecraft, Factorio, various MMOs, etc. will all run very well.
For Windows-only games, Parallels works shockingly well. I can run Skyrim with 3rd party shaders at 60 FPS without any issues in Parallels.
There is no 1st party support for SteamVR. You can supposedly get it working with some older Vive models. I couldn't get my Quest 2 working, even in Parallels. Some games with aggressive (rootkit) Anti-Cheats will probably also not work.
However, this is a fantastic general purpose machine for things like light web browsing, text editing, coding, etc.
Most games are still released for Windows + x86 (AMD/Intel).
Proton on Linux works wonders on AMD/Intel CPUs, but your best bet is still Windows.
Otherwise I'm unimpressed.
Okay, so how many displays can the base-model M4 Mac mini support? Is it one 5K over HDMI plus two 6K over two separate Thunderbolt 4 connections, for a total of three displays?
https://asahilinux.org/fedora/#device-support
Maybe M4 support will come soon.
Of course, there are no guarantees that it runs correctly. It probably doesn't, given that even Apple's and Microsoft's own software doesn't run correctly either. But saying software doesn't run perfectly in all cases is almost tautological.
[1] https://github.com/phoronix-test-suite/phoronix-test-suite
I wouldn't be so sure - if marcan loses interest (already looks like it), who is going to keep up with supporting the latest Apple chips?
When the M series chips were the hot new thing, there sure was developer interest - but now that a new chip is released every year, it becomes boring drudgery.
Look at support for T2 Macs - it took a decade to get them supported, not because the hardware was so different, but mainly because the hardware was 'boring'.
For some uses you won't get the best performance compared to native Linux. But for a Plex/Kodi server a VM should be great.
(On an x86 Apple laptop I found the power consumption better with a Linux VM on MacOS than with native Linux, so VMs can be quite efficient for some uses. Software builds sometimes run much faster in a Linux VM inside MacOS than natively in MacOS. On the other hand, I found Qemu inside a Linux VM for Android development was extremely slow.)
[1] https://social.treehouse.systems/@marcan/112277289414246878
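To make the VM point above concrete: on an x86 Mac like the one described, QEMU can use Apple's Hypervisor.framework backend for near-native CPU performance. A minimal sketch (the disk image name and resource sizes are placeholders, not anything from the comment):

```shell
# Boot a Linux guest on an Intel Mac using Hypervisor.framework
# (-accel hvf) so guest code runs directly on the host CPU.
# "linux.qcow2" is a placeholder disk image you'd create yourself.
qemu-system-x86_64 \
  -accel hvf -cpu host \
  -m 4096 -smp 4 \
  -drive file=linux.qcow2,if=virtio \
  -nic user,model=virtio-net-pci
```

The virtio disk and network devices are what keep I/O overhead low enough for use cases like a Plex/Kodi server.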
I think maybe in the future they'll only have one chip line, but that's just a wild guess.
New iMac with M4 (apple.com) | 509 points by tosh 1 day ago | 1058 comments
Developer that wants to run IntelliJ, a load of Docker containers, etc.? Sure, that's going to be tight, better get 32 or 64 GiB.
Remember that many Mac users are just folks that do web browsing, Office, and a bunch of other things and 16GiB is going to be enough for a few years.
I will say SVT-AV1 has had some significant ARM64 performance improvements lately (~300% year-over-year, plus bitrate savings at a given preset[1][2], so call it a 400% increase), so for many use cases software AV1 encoding (rather than hardware encoding) is likely the preferred option.
The exceptions, IMO, are concurrent gaming-plus-streaming (niche on macOS?) and video-server transcoding. However, even these exceptions are tenuous: because Apple Silicon doesn't play x86's logical-core / boost-clock games, and considering the huge multi-threaded performance of the M4, I think streaming with software AV1 encoding is quite feasible (for single streams) for both streaming and transcoding. x86 needs a dedicated AV1 encoder more so because of the single-threaded performance hit from running a multi-threaded background workload. And the bitrate efficiency will be much better with software encoding.
That said, latency will suffer and I would still appreciate a HW AV1 encoder.
[0] https://en.wikipedia.org/wiki/Apple_M4 [1] https://www.phoronix.com/news/SVT-AV1-1.8-Released [2] https://www.phoronix.com/news/Intel-SVT-AV1-2.0
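For reference, software AV1 encoding with SVT-AV1 is typically driven through ffmpeg. A minimal sketch, assuming an ffmpeg build with libsvtav1 enabled (the preset and CRF values here are illustrative, not recommendations from the comment above):

```shell
# Software AV1 encode via SVT-AV1.
# -preset ranges 0-13: lower = slower but better compression;
# 8 is a common middle ground on fast multi-core CPUs.
# -crf controls quality (lower = higher quality, larger file).
ffmpeg -i input.mp4 \
  -c:v libsvtav1 -preset 8 -crf 35 \
  -c:a copy \
  output.mkv
```

On a many-core chip like the M4, SVT-AV1 parallelizes well across threads, which is what makes single-stream software encoding plausible for live streaming or server transcoding.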
I wish Apple would invest in gaming, so that such capable hardware wouldn't be stuck with a puny market share of only 2% according to the Steam survey. [2]
[1] https://gamesbymason.com/2023/08/21/way-of-rhea-linux/#way-o...
It's basically an Intel NUC, 12 years later.
Edit to clarify: I don't think reducing our carbon emissions is nonsense; that should be our top priority as a society. That's also (part of) why AI is quite shit, honestly. I couldn't care less if they were just burning money by developing nothingburgers, but they're also burning through all our natural resources at insane rates.
However, I do think the term "carbon neutral" is quite nonsensical and just seems like a term to make the consumer feel less guilty about themselves; hell, sometimes it's even used to make the company execs feel better about themselves. I didn't forget about that HORRIBLE, TERRIBLE "mother earth" commercial Apple ran. DISGUSTING.
Almost nobody asked for this. I personally would have wanted one of my programs to not start failing with a cryptic message after upgrading to macOS 15.1 earlier today. But hey, crazies like me who want decently working software are apparently not welcome in the customer base.
The only reason I am still staying with Apple for my desktop needs is that I paid $8000 for my iMac Pro, and that was just five short years ago.
But as time goes by, buying one or two specialized text-rendering displays and going full Linux looks more and more attractive, especially with Fedora and Manjaro now offering "immutable" distros: you can frak around with your environment and then revert everything if you don't like it (or the opposite, do a DB commit of sorts, i.e. have your changes persist after reboot). Those features make backing up entire workstations even easier.
Sprinkle in an external ZFS server and the ability to just zfs send/recv entire disks with encryption and compression, and I think in just two or three short years I'll be laughing at Macs.
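The zfs send/recv workflow mentioned above can be sketched roughly like this (pool, dataset, and host names are all placeholders):

```shell
# Snapshot a dataset, then replicate it raw to a backup server.
# -w (raw send) ships the blocks exactly as stored on disk,
# preserving encryption and compression over the wire.
SNAP="tank/home@$(date +%F)"
zfs snapshot "$SNAP"
zfs send -w "$SNAP" | ssh backup-host zfs recv -u backup/home

# Later runs can send only the delta between two snapshots:
# zfs send -w -i tank/home@previous "$SNAP" | ssh backup-host zfs recv -u backup/home
```

Because the raw stream stays encrypted, the backup server never needs the dataset's encryption key, which is part of what makes whole-workstation backups attractive here.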
Apple keeps dropping the ball. iOS 18 lost all my tab groups in Safari as well. And Photos randomly chooses not to show pictures in the big Photos feed; you have to know which day they were saved to be able to see them.
/facepalm
Apple is now in decline; it couldn't be more obvious from these fairly outrageous bugs, plus the fact that they are now regular followers like everyone else, jumping on the "AI" bandwagon.
I've been saying this since the butterfly-keyboard fiasco. At that point it was clear that professional users weren't the focus anymore, and I promptly left the ecosystem. I still have an iPhone (12 mini) that I only keep because it still works, but every time CarPlay tries to murder me in traffic, I get one step closer to just yeeting that phone into the sea.
iPhones have gotten so good they can barely come up with anything new to add year-over-year. Young people in the west are almost exclusively buying iPhones because people like them.
They basically own the tablet space; if you're looking for a tablet, there's almost no reason to go with anything other than an iPad. Same for smart watches: they make the best-selling watch in the world.
For the number of devices they move, they're shockingly reliable and have a smoother customer support / coverage system than any other company I've had to deal with. That's why people keep coming back.
It's pretty bizarre to say they're in decline. The only area I can see active decline is how badly they let Spotify eat their lunch with music streaming when they used to basically own digital music distribution.
The 2019 16” was a step in the right direction. More ports, better cooling, better keyboard.
The M1 Pro/Max line up brought back HDMI, MagSafe, SD card slots, and are seriously fast, quiet, & cool. The M2 and M3 releases have been iterative performance improvements and haven’t made any stupid decisions.
Apple’s also invested development effort into useful tooling for developers like their virtualisation framework - this has made Docker on Mac vastly more pleasant for instance.
Software though? Not at all impressed. They still check every program you ever start, and it's very noticeable in performance.
Or they don't care about Intel Macs anymore, that's also a real possibility.
>I've been saying this since the butterfly-keyboard fiasco.
It doesn't look like it: https://www.google.com/search?client=safari&rls=en&q=aapl&ie...
You should include additional factors in your analysis.
If I was judging by the context I'd say it's just a good old cynical whinge which is OK but very subjective.
At some point sales drop below what executives find acceptable. That has already started for iPhones and has been the case for several years now.
Nowadays though... I am stuck with former workstation-grade hardware where Neovim needs FIVE SECONDS TO START because macOS is auditing each and every one of its syscalls. I switch to my (now allegedly ancient) full-AMD laptop with a 5500U CPU, and the only thing that needs more than 0.5 seconds to start is Firefox. I was not able to find one thing that did not react instantly. I am seriously considering just plugging my 35" gaming display into the Linux laptop and making that my main work setup.
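For what it's worth, Neovim has a built-in startup profiler that shows where time like that actually goes (the log path here is arbitrary):

```shell
# Write a per-step timing log of Neovim's startup, then quit.
nvim --startuptime /tmp/nvim-startup.log +qall

# Inspect the log: each line records cumulative and per-step
# milliseconds for sourced scripts and plugins.
less /tmp/nvim-startup.log
```

If the slow steps are spread evenly across every sourced file rather than concentrated in one plugin, that points at per-file overhead (such as OS-level checks on each open/exec) rather than the editor configuration itself.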
And you are right -- pro users made Apple rich but are now undesired because they apparently demand too much ("Who needs stable software, bleeeerrrrgh! Am I right guys?"). Yeah, screw Apple. I am back on the hamster-wheel employment grind now, sadly, but once I stabilize a bit more I'll just move to two HiDPI displays and assemble a future-proof Linux workstation to go with them. Pretty sure that with periodic thermal-paste maintenance it can easily last me 10 years, and I'd only change it if there's something seriously tempting out there (about which I am very doubtful; the tech giants were only worried about becoming oligopolists and care not about their users' needs).
Apple had its opportunities. They wasted them. Sure, many people will consider them the top for a while still and will keep buying, but their pricing policy has made it blindingly obvious that fewer people are buying, and they are now doing their damnedest to compensate by including less in the package, making the carton packaging itself cheaper, or just making all products except the base models outrageously overpriced. That's how they keep the profit margins. The curse of being a publicly traded company and all that.
Those policies will work for them. For some time more. I wonder what happens after.
Your setup should not be taking many seconds to do those routine things. Last time I experienced something like that, I had a dying harddrive that was logging a continual series of read timeouts.
My Neovim is heavily customized (AstroNvim) but again, it starts instantly on a supposedly weaker CPU.
Any pointers on how to find those read-timeout events or other signs of dying hardware?
I don’t want to sound like “it works on my machine”, but what you’re describing is so far out of expectations that I’ve gotta wonder if it’s an error condition.
And yeah I'll do the usual checks, disk health included. Been putting it off for a while anyway.
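For reference, the "usual checks" on a Mac might look something like this (smartctl comes from smartmontools, e.g. via Homebrew; the log predicate is just one way to surface I/O errors):

```shell
# Filesystem consistency check of the boot volume:
diskutil verifyVolume /

# SMART health attributes for the internal disk
# (brew install smartmontools first):
smartctl -a /dev/disk0

# Recent I/O errors in the unified log, last 24 hours:
log show --last 1d --predicate 'eventMessage CONTAINS[c] "I/O error"'
```

A drive logging repeated read timeouts will usually show up both in the SMART error counters and as kernel I/O error messages in the unified log.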
For additional context, last weekend I went around to every old Intel Mac in a medical office to upgrade their OSes and some of the apps on them. None of them were speed demons, but they were all just proportionally slower than my much newer Mx Macs. Regular "small" apps still loaded quickly and were perfectly usable. This is in a busy office where any slowdowns that kept people from working at full speed would be fixed.
And just because it works for me doesn't automatically mean it's got to work for you, of course. I'm not going to doubt your experience. It's more that what you're describing is so very different from what I'm seeing that it feels like there's got to be something more at play here.
Apple has been improving repairability; their laptops have many replaceable parts and generally last longer than PCs.
Compare the new price against the four-year-old used price for Apple vs. any PC. In my experience, used Apples age well, are reliable, still get years of OS updates, and generally age better than PCs.
And it even got the newest Apple Intelligence update for macOS, which is nice.
The mini seems like the perfect candidate for a truly mini version and a ... creative redesign. Bring back the trash can!