About time. This also applies to their older models such as M2 and M3 laptops.
In the U.S., the MacBook Air lineup continues to start at $999, so there is no price increase associated with the boost in RAM.
The M2 MacBook Air now starts at $1,000 for 16GB of RAM and 256GB of storage. Limited storage aside, that’s surprisingly competitive with most modern Windows laptops.
People don’t want the 8GB of RAM because they’re all used to Windows, I bet.
Or Linux. Or macOS. Or any computer, really.
Watch how operating systems will now get heavier accordingly.
Yes, it’s described as being for “Apple Intelligence”, which I’m sure won’t be bloated or hard to disable at all… sigh
It’s literally a toggle in Settings under Apple Intelligence.
Finally, the RAM on that thousand-dollar machine is on par with my decade-old T420!
Hello, fellow 7-row-keyboard ThinkPad user! (I use a W520)
EDIT: btw, it’s a bit more than a decade old
Considering their industry-leader status, it’s about 5-7 years too late.
I dunno if I’d even consider them an industry leader, unless you break down their ubiquity by industry category (in which case they lead graphic design and maybe video editing, iirc). They lead phone sales in the US by a lot, but their overall desktop share is still relatively small (<10%), and their global footprint is buoyed only by iOS (which is still below Windows and Android).
I would say they’re an innovator, and they push certain companies to innovate, but they don’t really lead by that many metrics.
I meant leading as in “if Apple does something, others will too”. That has been true for quite a long time now.
And by that definition, I agree
The M2 MacBook Air now starts at $1,000 for 16GB of RAM and 256GB of storage. Limited storage aside, that’s surprisingly competitive with most modern Windows laptops.
What do you mean limited storage aside?
If we disregard the fact that it’s terrible value for money, it’s a good deal. No laptop sold in 2025 and costing over a grand should have anything less than a terabyte.
But it has an Apple logo and it browses Facebook just fine.
insultingly tiny, unupgradeable storage aside, that’s surprisingly competitive with most modern Windows laptops
It’s not ideal, but you’re getting probably the best hardware on the market in return. The M series still dominates the CPUs in Windows laptops, and the build quality on most $1000 laptops leaves a lot to be desired.
build quality on most $1000 laptops
You’re not kidding.
I have a couple of laptops from various vendors, and they’re all built like shit.
ASUS is especially eyerolly: the case is literally crumbling into pieces. Like seriously? You couldn’t have picked a material that’s not literally going to disintegrate in two years on a $1200 laptop?
Yeah, a lot of manufacturers are just bad. I knew people who had Dell and MSI laptops, and those things felt like toys. Cheap plastic and very wobbly hinges. The only manufacturer I genuinely trust is Lenovo. My Legion is a bit thick, but I can at least rest easy that it’s built well.
I saw someone’s Samsung laptop last year and the screen was wobbling all over the fucking place. I couldn’t believe what I was seeing. I commented on it, and the owner just gave me a blank look.
Lenovo, outside of their really cheap consumer options - like, the $500-and-under options - is pretty solid.
But yeah, build quality is one reason why I roll my eyes at the ‘haha stupid buying apple! apple tax! lol ripped off!’ crowd: I mean, maybe, but as soon as you pick up a MacBook whatever, it’s immediately obvious that you’re getting something for what you’re paying, and not some bendy, flexy piece of plastic crap that will maybe physically survive the warranty period, but not much more.
The best? Debatable. You ever watch Louis on YouTube? He constantly rags on bad hardware design when repairing MacBooks lol.
There’s hardware performance and then there’s hardware repairability. He’s talking about the latter.
That’s what I’m talking about too. Hardware repairability.
and I’m saying that Simple was talking about hardware performance
You said the “latter”, which refers to the last thing you mentioned, which was repairability. You mean “former”, then.
by “he” I meant Louis lol
The localllama people are feeling quite mixed about this, as Apple is still charging through the nose for more RAM. Like, orders of magnitude more than the bigger ICs actually cost.
It’s kinda poetic. Apple wants to go all in on self-hosted AI now, yet their incredible RAM stinginess over the years is derailing that.
I do have a 64GB M1 MacBook Pro, and man, that thing screams at LLM work. I use it to serve models locally throughout my house, while it otherwise still works as a fantastic computer (usually using about half the RAM for LLM serving). I still prefer a 4080 for image generation, though.
Completely laughable. I literally had 16 GB of DDR3-1600 for my 2600K from 2011, which I handed down to a nephew as their first PC to tinker with. Hell, my local NAS has more than that…
We use Windows PCs at work as software engineers now, but when I was training I used a MacBook Pro M1 with 16GB of RAM, and that thing was incredibly performant.
I know it’s in vogue to shit on Apple, but they build the hardware and the software, they’re incredibly efficient at what they do, and I don’t think I ever saw the beachball loading icon thing.
Now, the prices they charge to upgrade the RAM are something I can get behind shitting on.
The chip and OS won’t do shit when your RAM is saturated by Electron apps taking 800MB each. Maybe macOS behaves better under very high memory pressure than Windows does, but that doesn’t mean it’s okay to rip off consumers. That whole “8GB on Mac = 16GB on Windows” thing has been bullshit all along, and is mostly based on people looking at Task Manager and seeing high RAM usage on Windows (which is a good thing: unused RAM does nothing for you, so the OS uses it for caching).
Haha, no, macOS does not perform better under very high memory pressure. RIP me working on a MacBook Air.
I have to make sure not to run too many things at once…
Now that I think of it, yeah, my work Mac simply shows a popup telling me to kill an app. It just doesn’t deal with high memory pressure lol
I used Windows, Mac and Linux in the past year.
It’s not Mac that’s fast, it’s Windows that sucks hard.
Same.
- Mac - Fast, user-friendly, and UNIX-based.
- Windows - Fast (I have a beast), bloated, stupid command prompt (“Add-Migration”, capital letters, really?), wants to spy on me.
- Linux - Fast, but a lot of work to get everything working as you would on Windows or Mac, and I’m past those days. I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.
it’s almost like people make choices to suit their needs and there isn’t a single solution for everybody.
I wonder what the industry standard is for developers? Genuinely. I’ve heard it’s Mac, but my company is all in on Microsoft, and I haven’t really heard of companies developing on Linux. Which isn’t to say Linux doesn’t have its place, but I’m aware this place is insanely biased towards Linux.
My current Linux machine needed exactly zero config post-install, and even stuff like the fingerprint reader is working; I’m using it instead of passwords in the terminal.
I can also play games pretty well, it’s usually smoother and less buggy than on Windows.
I feel Linux is not a compromise for me anymore; Windows is fast becoming one, though.
What distro would you recommend? I’m prepared to try it over the weekend.
How does it work with GPU drivers for a GeForce RTX 4080?
Anything else I need to be aware of?
I just want to turn the thing on and play Factorio or Minecraft, not figure out if my 4080 will run on it etc.
Funny that you chose two games that run natively on Linux.
Minecraft runs great; I don’t know about Factorio.
But I know some native versions suck absolute ass and force you to use the Windows version via Proton regardless. ETS/ATS and Cities: Skylines 1 being my immediate personal examples.
Every place I’ve been at had developers using Windows machines and then SSHing into a Linux environment.
Makes sense for sysadmins or something, but little sense for developers and engineers writing code to build enterprise software.
As a developer writing code who used Windows to SSH into Linux servers, I would disagree. But of course it depends on the company and the nature of the work; just offering my experience.
What are you writing code for?
I literally can’t think of an example where SSHing into a terminal is going to give a good workflow. Just using Nano or Vi?
Like, no IDE?
I know it’s in vogue to shit on Apple…
Apple does have a lot of vertical integration, which allows first-party stuff to function well, and they work closely with a lot of their premium third-party software partners. But try running an actually RAM-hungry process like a local LLM, for example, and all but the highest-end, latest-edition MacBook Pro WILL shit the bed.
You can use Linux with RAM compression to get the same kind of economy that macOS does.
Just nobody bothers.
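For anyone curious, the Linux equivalent is zram (compressed swap held in RAM). A minimal sketch of setting it up by hand, assuming your kernel ships the zram module and zstd support (true on most current distros) and that you run it as root; the size and priority here are illustrative, not recommendations:

```python
# Sketch: create a zstd-compressed zram swap device on Linux
# via the standard sysfs interface. Requires root.
import subprocess
from pathlib import Path

subprocess.run(["modprobe", "zram"], check=True)      # exposes /dev/zram0
zram = Path("/sys/block/zram0")
(zram / "comp_algorithm").write_text("zstd")          # must be set before disksize
(zram / "disksize").write_text("8G")                  # 8 GiB of uncompressed capacity
subprocess.run(["mkswap", "/dev/zram0"], check=True)  # format the device as swap
subprocess.run(["swapon", "--priority", "100", "/dev/zram0"], check=True)  # prefer it over disk swap
```

Most distros wrap these same steps in a package (zram-tools on Debian, zram-generator on Fedora), so in practice it’s one install away.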
Fucking PHONES had more RAM. It was so fucking stupid. And despite their arguments, it was proven time and time again that 8GB was not enough.
“640k is enough for anyone.”
Now that 64GB is the standard
Where? Workstations at best.
Google Chrome /s
Gaming at home
Perfect, just when I’ve decided 16GB is the bare minimum these days too. Day to day, I max out 16 on my laptops without even trying. 32 is my new minimum.
And here I thought that 8GB on Mac was at least as good as 16GB on plebeian PCs.
Their sales figures seem to show that the majority of people don’t care. For what I use my MacBook for, I’m one of those people who don’t care. That’s probably because it’s not my main PC, so I use it for the things most people probably use it for (browsing, watching media, some light work).
My daily driver MacBook Pro has 8GB of RAM, and so far, that’s been perfectly sufficient for my needs. Some might argue that 8GB is inadequate for a 1,700€ device, but I don’t think most people would notice a difference. This focus on specs might make more sense with computers, but with smartphones especially, I never understood the obsession with performance. My mid-range Samsung handles everything instantly - I can’t think of a reason it would need to be any faster. Numbers on paper seem irrelevant when they don’t translate to everyday use.
macOS, no matter what anyone says, has extremely efficient memory management. It’s seriously impressive how efficient that OS truly is, and it’s no surprise they stuck with 8GB for so long. The thing these clickbait articles don’t really bring to light is that the 16GB increase is really for Apple Intelligence. If that weren’t a thing, these Macs would have stuck with 8GB.
It is “efficient” because they just dump everything into swap. If I cold boot my M1 Air, it’ll be using 7GB of RAM and 4GB of swap without anything running in the background. I have this ongoing bug as well where some background apps will stop responding and the system can’t stop the process, so it starts a new one, and it keeps doing this until I either stop the app manually or my storage is completely full because swap is taking 80GB of my internal storage.
Your MacBook is a cell phone?! Hahaha jk
With no programs running, my Mac mini is using 16GB, so I’m not surprised.
That’s normal for these computers. The idea being that it doesn’t really benefit you to have a ton of empty RAM sitting around waiting to be used, so the OS makes no effort to clear it out until the space is needed.
If you believe their marketing, it’s actually doing the opposite and preemptively loading stuff into RAM in order to make your common tasks feel as snappy as possible. But yeah, either way you’ll notice the memory is always “full”, yet you never seem to run out.
Well, that would be good, but it goes completely against how I’ve learned to manage my machine these past three decades.
Yeah, it was a trip for me as well to adapt to the new ways. For example, it took me a long, long time to adjust to letting the computer manage the multitasking for me. I would habitually close out programs I wasn’t using, because I felt deeply, from my decades of experience, that running tons of things at once would cause many issues.
I was very uncomfortable letting all these “active” programs pile up, but it really turned out to be all good. The computers are designed to be used this way. And really, I’m better off for it, not having to go in and micromanage everything constantly.
What I’m trying to say is that learning is not something that is ever finished, you know? There came a day when we stopped defragmenting our hard drives, and now the day has arrived where the computer utilizes all the RAM, all the time.
Interesting, I didn’t know that. Is that controlled by the operating system or something else? I’m curious about whether my Debian laptop does the same.
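It’s the operating system, yeah, and Debian does the same thing: Linux keeps recently read files in otherwise-idle RAM as page cache and drops them the moment an application needs the space. `free -h` shows it under buff/cache. If you want to poke at the raw numbers yourself, here’s a small sketch reading /proc/meminfo (standard on any Debian kernel, no extra packages assumed):

```python
# Sketch: show how much RAM Linux is using as reclaimable page cache.
# /proc/meminfo reports values in kB.
meminfo = {}
with open("/proc/meminfo") as f:
    for line in f:
        key, value = line.split(":")
        meminfo[key.strip()] = int(value.split()[0])

print(f"Page cache:   {meminfo['Cached'] / 1024:.0f} MiB (reclaimed automatically under pressure)")
print(f"MemAvailable: {meminfo['MemAvailable'] / 1024:.0f} MiB (what you can actually still allocate)")
```

MemAvailable is the number that matters; “free” RAM on its own always looks scarily low on a machine that’s been up for a while.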