Hacker News
3 hours ago by neilpanchal

I just got one. I’m blown away by the speed as well. Chrome runs insanely fast! Alas, it’s not developer-ready yet. Brew is a mess. Docker doesn’t work. PyCharm is a WIP, although you can use the x86 version. I was skeptical of the hype but this little laptop has made me realize how slow everything else is.

Unfortunately, while the hardware has accelerated far beyond expectations, the software - specifically macOS Big Sur - is a major step backward. So many fucking animations. Everything feels like operating in molasses. The UI changes seem to be shoehorned into a desktop that doesn’t need giant white space for fat fingers. Menu bars are twice as tall, taking up precious space. The top bar was already crammed with a lot of icons; now they’ve made them sparsely spaced by adding padding between the icons. Everything is baby-like, with rounded corners and without borders. Segmented UI controls are no more. I want to ask Apple’s UI team: WHY!? What is wrong with the macOS Catalina UI? Until you can satisfactorily answer that, there shouldn’t be any change. Stop changing the UI like you’re working at Hermès. It’s not fashion. If the reason is to unify everything across all screen sizes, then you’re sacrificing the experience on all three platforms. Perhaps making it easy to develop apps for all 3 platforms is a plus, but as a user, this all feels like a regression. I’ve lost hope in modern UI engineering. It’s not engineering anymore.

I want macOS that has a UI of Windows 95. That would be totally insane on Apple Silicon.

18 minutes ago by JumpCrisscross

> Stop changing the UI like you’re working at Hermès. It’s not fashion.

Of course it is. Our phones are intimately close to us. Physically, cognitively, socially and even emotionally. They may be the most widely-owned, intimately-connected object humans have ever invented outside religion.

Our computers don't occupy as close of a niche. But they're in a similar space.

I agree with your observation that the new OS feels like molasses. I wish they’d gone for a "snappy" feel. (Though keyboard shortcuts get around that.) But ignoring that Macs and iPhones are objects of fashion as well as computing devices misses a deep part of what Jobs saw that technologists missed.

2 hours ago by krrrh

It’s pretty trivial to disable most animations (and more importantly, transparency!). I’ve been doing that on new macOS installs since Jaguar and it only takes a few minutes. If you want to move quickly you’re probably already using keyboard shortcuts and ignoring the dock and toolbars.
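For example, here are some of the defaults tweaks I apply. Exact keys vary by macOS release and a few are unofficial, so treat this as a sketch:

  # Reduce transparency (same as the checkbox in Accessibility settings)
  defaults write com.apple.universalaccess reduceTransparency -bool true
  # Kill or speed up the common window/Dock animations
  defaults write NSGlobalDomain NSAutomaticWindowAnimationsEnabled -bool false
  defaults write NSGlobalDomain NSWindowResizeTime -float 0.001
  defaults write com.apple.dock autohide-time-modifier -float 0
  defaults write com.apple.finder DisableAllAnimations -bool true
  killall Dock Finder   # restart both so the changes take effect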

For less technically adept users (i.e. most users) the animations and spacings mostly seem to help them understand what’s going on. I know everyone has their preferences, but I don’t really get the level of griping that accompanies every release.

12 minutes ago by lynndotpy

The animations on the iPhone are what made me return to Android.

On iPhones, you can only "reduce motion", which still has the "moving through molasses" feeling, since it replaces animations with a fade in/out. Can you truly disable most animations in macOS, or are they simply replaced with fade in/out animations?

2 hours ago by creddit

Brew is only slightly a mess, in my experience.

I've installed brew both in the historical /usr/local location and in its future home of /opt/homebrew. I then created these two aliases:

  # ARM-native Homebrew
  alias armbrew="/opt/homebrew/bin/brew"
  # Intel Homebrew, forced to run under Rosetta 2
  alias intbrew="arch -x86_64 /usr/local/bin/brew"

My PATH selects programs installed in the /opt/homebrew location first and then /usr/local, as shown below. I try to install the ARM version first with `armbrew install -s <PKG>`, and if it fails, I move to using the `intbrew` alias as normal. I haven't really had any issues.
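For reference, the PATH ordering is just this (e.g. in ~/.zshrc):

  # Prefer ARM-native binaries; fall back to the Intel/Rosetta tree
  export PATH="/opt/homebrew/bin:/usr/local/bin:$PATH"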

It's obviously still messy but not in a way that is too bad!

23 minutes ago by rewtraw

Running iTerm/Terminal in Rosetta allows it to work without issue; it's the best solution until more packages work natively under ARM.

Duplicate iTerm.app, right click & "Get Info", check "Open in Rosetta". Now install homebrew how you would on an Intel Mac and everything will work.
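To confirm the duplicated terminal is actually running translated:

  arch                              # prints "i386" under Rosetta, "arm64" natively
  sysctl -n sysctl.proc_translated  # 1 = running under Rosetta, 0 = native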

an hour ago by apatheticonion

> I want macOS that has a UI of Windows 95. That would be totally insane on Apple Silicon.

So much this. I used to run Windows 7 in classic mode and really liked the low-footprint, no-nonsense appearance. Windows 10 has no such mode.

I wonder if we will see Linux support.

an hour ago by spurgu
an hour ago by bawolff

That's not really on topic. What Linus wants to use personally has little bearing on what architectures Linux will (eventually) support.

2 minutes ago by abvdasker

These reviews are promising, but I'd like to see more hard performance data before drawing any conclusions. Obviously these testimonials are anecdotal and could mostly be the result of a placebo effect after making such an expensive purchase. If benchmarks can demonstrate that the M1 is a step-change improvement over x86 processors, I may even buy one of these laptops.

2 hours ago by xoa

Not to speak for anyone else, but one thing I gently disagree with:

>Given that Hackintoshers are a particular bunch who don’t take kindly to the Apple-tax[...]

I have zero issues with an Apple premium or paying a lot for hardware. I think a major generator of interest in hackintoshes has been that there are significant segments of computing that Apple has simply completely (or nearly completely) given up on, including essentially any non-AIO desktop system above the Mini. At one point they had quite competitive Power Macs and then Mac Pros covering the range of $2k all the way up to $10k+, and while, sure, there was some premium, there was feature coverage, and they got regular yearly updates. They were "boring", but in the best way. There didn't need to be anything exciting about them. The prices did steadily inch upward, but far more critically, sometime between 2010 and 2012 somebody at Apple decided the MP had to be exciting or something and created the Mac Cube 2 (the 2013 trash-can Mac Pro), except this time forcing it on everyone by eliminating the old MP entirely. And it was complete shit, and to zero surprise never got a single update (since they totally fucked the power/thermal envelope, there was nowhere to go), and users completely lost the ability to make up for that. And then that was it, for 6 years. Then they did a kind-of-sort-of-OK update, but at a bad point given that Intel was collapsing, and forcing in some of their consumer design in ways that really hurt the value.

The hackintosh, particularly virtualized ones in my opinion (running macOS under ESXi deals with a ton of the regular problem spots), has helped fill that hole as Frankenstein MP 2010s finally hit their limits. I'm sure Apple Silicon will be great for a range of systems, but it won't help in areas that Apple just organizationally doesn't care about or doesn't have the bandwidth for, because that's not a technology problem. So I'm a bit pessimistic/wistful about that particular area, even though it'll be a long time before the axe completely falls on it. It'll be fantastic and it's exciting to see the return of more experimentation in silicon, but at the same time it was a nice dream for a decade or so to be able to freely take advantage of a range of hardware the PC market offered which filled holes Apple couldn't.

3 hours ago by brundolf

This is fascinating:

> Retain and release are tiny actions that almost all software, on all Apple platforms, does all the time. […] The Apple Silicon system architecture is designed to make these operations as fast as possible. It’s not so much that Intel’s x86 architecture is a bad fit for Apple’s software frameworks, as that Apple Silicon is designed to be a bespoke fit for it. […] Retaining and releasing NSObjects is so common on macOS (and iOS) that making it 5 times faster on Apple Silicon than on Intel has profound implications on everything from performance to battery life.

> Broadly speaking, this is a significant reason why M1 Macs are more efficient with less RAM than Intel Macs. This, in a nutshell, helps explain why iPhones run rings around even flagship Android phones, even though iPhones have significantly less RAM. iOS software uses reference counting for memory management, running on silicon optimized to make reference counting as efficient as possible; Android software uses garbage collection for memory management, a technique that requires more RAM to achieve equivalent performance.

an hour ago by darren_

This quote doesn’t really cover why M1 Macs are more efficient with less RAM than Intel Macs. You’ve got a memory budget, it’s likely broadly the same on both platforms, and the speed at which your retains/releases happen isn’t going to be the issue. It’s not like Intel Macs use GC where the M1 uses RC.

(It explains why iOS does better with less RAM than Android, but the quote is specifically claiming this as a reason for 8GB of RAM to be acceptable.)

40 minutes ago by blihp

It's a contributing factor. If things like retain/release are fast and you have significantly more memory bandwidth and low latency to throw at the problem, you can get away without preloading and caching nearly as much. Take something simple like images on web pages: don't bother keeping hundreds (thousands?) of decompressed images in memory for all of the various open tabs. You can just decompress them on the fly as needed when a tab becomes active and then release them when it goes inactive and/or when the browser/system determines it needs to free up some memory.

22 minutes ago by jml7c5

Decompression is generally bound by CPU speed, not memory bandwidth or latency.

4 minutes ago by rodgerd

I can't speak to the macOS system, but from years spent JVM tuning: you're in a constant battle finding the right balance of object creation/destruction (the former burning CPU, the latter creating garbage), keeping memory use down (more collection, which burns CPU and can create pauses and hence latency), or letting memory balloon (which can eat resources, and makes the memory sweeps worse when they finally happen).

Making it cheaper to create and destroy objects with hardware acceleration, and to do many small, low-cost reclaims without eating all your CPU would be a magical improvement to the JVM, because you could constrain memory use without blowing out CPU. From what's described in TFA it sounds like the same is true for modern MacOS programming.
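For illustration, these are the kinds of knobs you end up juggling (heap sizes and app.jar are just placeholders):

  # Small fixed heap: frequent collections, more CPU burned, but memory stays constrained
  java -Xms512m -Xmx512m -XX:+UseG1GC -XX:MaxGCPauseMillis=50 -jar app.jar
  # Large heap: fewer collections and pauses, but ballooning memory and nastier sweeps
  java -Xms8g -Xmx8g -XX:+UseG1GC -jar app.jar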

an hour ago by uncomputation

Someone please correct me for the sake of all of us if I’m wrong, but it sounds like Apple is using specialized hardware for “NSObject” retain-and-release operations, which may bypass/reduce the impact on general RAM.

41 minutes ago by nemothekid

I saw that point brought up on Twitter and I don't know how it makes more efficient use of RAM.

Specifically, as I understood it, Apple software (written in Objective-C/Swift) uses a lot of retain/release (i.e. Automatic Reference Counting) on top of manual memory management, rather than other forms of garbage collection (such as those found in Java/C#), which gives Objective-C programs a lower memory overhead (supposedly). This is why the iPhone ecosystem is able to run so much snappier than the Android ecosystem.

That said, I don't see how that translates to lower memory usage than x86 programs. I think the supporting quotes he used for that point are completely orthogonal. I don't have an M1 mac, but I believe the same program running on both machines should use the same amount of memory.

34 minutes ago by torstenvl

I don't think that's quite right. Apple believes strongly in retain-and-release / ARC. It has designed its software that way; it has designed its M1 memory architecture that way. The harmony between those design considerations leads to efficiency: the software does things in the best way possible, given the memory architecture.

I'm not an EE expert and I haven't torn apart an M1, but Occam's Razor would suggest it's unlikely they made specialized hardware for NSObjects specifically. Other ARC systems on the same hardware would likely see similar benefits.

44 minutes ago by gameswithgo

It makes Objective-C and Swift memory management faster, but it doesn’t reduce RAM usage at all. (Maybe a wee bit less bandwidth used.)

39 minutes ago by jolux

Their hardware is almost certainly specialized for reference counting, but I would be surprised if they had a custom instruction or anything.

an hour ago by gameswithgo

Yeah, it's a bit of a stretch. To the extent that macOS apps use garbage collection less than PC apps, they would need less RAM. But they are kinda hopping around a macOS vs Android comparison, which makes no sense. I think it's Mac enthusiasts trying to imagine why a max of 8 or 16GB is OK. It is OK for most people anyway.

33 minutes ago by jml7c5

I think the author doesn't understand what Gruber wrote here. Android uses more memory because most Android software is written to use more memory (relying on garbage collection). It has nothing to do with the chips. If you ran Android on an M1, it wouldn't magically need less RAM. And Photoshop compiled for x86 is going to use about the same amount of memory as Photoshop compiled for Apple silicon. Sure, if you rewrote Photoshop to use garbage collection everywhere then memory consumption would increase, but that has nothing to do with the chip.

6 minutes ago by Ar-Curunir

If your hardware enables more regular/efficient garbage collection, then it absolutely can lower memory consumption.

Given that the M1 chip was designed to better support reference counting, it makes sense that doing the same for GC could lead to a benefit.

2 hours ago by lisper

Remember Lisp machines? The M1 is a Swift machine.

2 hours ago by brundolf

I'm wondering if the "optimized for reference-counting" thing applies to other languages too. i.e. if I write a piece of software in Rust, and I make use of Rc<>, will Macs be extra tolerant of that overhead? In theory it seems like the answer should be yes

2 hours ago by lilyball

I sure hope so. In macOS 10.15, the fast path for a retain on a (non-tagged-pointer) Obj-C object does a C11 relaxed atomic load followed by a C11 relaxed compare-and-exchange. This seems pretty standard for retain-release and I'd expect Rust's Rc<> to be doing something similar. It's possible Apple added some other black magic to the runtime in 10.16 (and they haven't released the 10.16 objc sources yet) but it's hard to imagine what they'd do that makes more sense than just optimizing for relaxed atomic operations.

2 hours ago by klelatti

But how? It has an Arm CPU - how does it differ from any other machine with a 64 bit Arm CPU?

2 hours ago by brundolf

The M1 is much more than a CPU, and a CPU is much more than an instruction set

2 hours ago by meragrin_

I doubt Apple allows anyone with the knowledge to speak about it.

2 hours ago by Karupan

> running on silicon optimized to make reference counting as efficient as possible

I'm curious to understand this. Is this because of a specific instruction set support, or just the overall unified memory architecture?

2 minutes ago by zarkov99

I am thinking about getting one mostly to ssh into a Linux server. I would like to run emacs on the server and have its display bounced back via X to the Mac. Is this practical? I tried XQuartz on my wife's Mac but the fonts looked like crap.
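For context, the setup I have in mind is roughly this (hostname is a placeholder):

  ssh -Y user@devbox.example.com  # trusted X11 forwarding; needs XQuartz running on the Mac
  emacs &                         # sshd sets DISPLAY, so the Emacs frame opens on the Mac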

8 minutes ago by koffiezet

I understand the machine is great or going to be great for most use cases. My MBP is my main workhorse, but as a freelance SRE "devops" guy, the Apple ARM platform won't be suitable for my job any time soon, if ever.

Docker is not yet available - but even when it does become available, emulating virtualised x86 code is explicitly not going to be supported. That in many cases means that pulling a Docker image built in a CI/CD pipeline where a dev screwed something up, and debugging it locally, is no longer an option. If I wasn't freelance, I could probably get away with some cloud instance to run all my Docker stuff, but I'm dealing with too many different environments, for clients with various different legal requirements, making this simply 'not an option'.

Too bad, because the machines look very promising for everything else. Development tools aren't there yet, but I expect that to be fixed pretty quickly.

13 minutes ago by emadabdulrahim

The only downside of the amazing new M1 MBP is that it runs WoW at max settings at 60fps. And now I'm back in the world of Azeroth, especially with the launch of Shadowlands.

What the hell, Apple, I thought I was safe and immune from video games with my MacBooks.

2 hours ago by nbzso

Those of us who have been in UI design long enough can recognize the result of attention to detail and a professional GUI. We have all used OS X not only for its UNIX-like core (Darwin) but for its consistent UX and UI libraries. At some point in time Apple was influencing our work in a really meaningful way by setting the standard (remember the Apple Human Interface Guidelines pre-Yosemite). For me personally, Soundtrack Pro is the most polished professional interface ever made. So in this context, UI “innovation” through emoji and white space implemented for touch interaction (without touch interaction) is funny but not usable. Performance aside (which is a big accomplishment), I miss the old approach with its balance of contrast and natural flow, and I will stay on Catalina as long as I can. If Apple changes their stance on telemetry and bypassing things, and fixes the UI/UX design, I have no problem joining again. What is lacking on the Linux desktop is a consistent approach to UI, but for some of us maybe it is time to reevaluate and relearn things. My personal time investment is in Emacs; with time I have more and more respect for those ideas of freedom and consistency. The selling point for me with Apple was a professional interface and high UI standards; sadly, they are gone. But hey, every one of us is different, and that is good, right?

20 minutes ago by jolux

Oh hey another emacs user!

You can turn off all the telemetry in macOS, and they ask you if you want it on when you set up the computer.

Agree to disagree on Big Sur, I love the new look. Keep in mind they’re calling it macOS 11, so there are probably bigger and less superficial changes down the road.

6 minutes ago by sneak

> You can turn off all the telemetry in macOS and they ask you if you want it on when you setup the computer.

That's false. You can turn off OS analytics, but there is tons of telemetry built into almost every Apple app, separate from that, that you cannot disable. It tells you about it on first app launch. Open Maps, for example, and it will tell you about the unique, rotating identifier it uses to track your searches. Opting out of OS analytics does not disable telemetry for the other Apple services now deeply integrated into the OS. Even disabling these features doesn't prevent the Mac from talking to the services, such as in the case of Siri.

Additionally, Gatekeeper OCSP checks on app launch serve as telemetry in practice, and there is no preference or setting to disable them.
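The closest thing to an opt-out I know of is blocking the OCSP host outright, which has real tradeoffs, since it also blocks legitimate certificate revocation checks:

  echo "0.0.0.0 ocsp.apple.com" | sudo tee -a /etc/hosts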

an hour ago by grishka

It's all mostly redesign for the sake of redesign at this point. Desktop OSes have been feature-complete for quite some time, but they still have to update every year. They have to. Don't you even dare question that. I'm still on Mojave and it does everything I need from an OS. I also absolutely love native Mac apps, which are becoming rarer and rarer. And no, iOS apps that run on macOS aren't native Mac apps. The abomination that is the Mojave App Store? That definitely took some extra talent to break every single UI guideline, but thankfully I only open it once every couple of months.

22 minutes ago by nbzso

Just a thought: if someone in 2008 had asked me what desktop interfaces would be used in 2020, my answer might have been that Apple would implement a new desktop paradigm on top of Raskin's zoomable UI ideas (https://en.wikipedia.org/wiki/Jef_Raskin). But here we are: a monster SoC with Cartoon Network on top. :)

15 minutes ago by grishka

The thing with interfaces is that there's no inherent need for change if the method of interaction doesn't change. It was a non-touch screen, a keyboard, and a mouse/trackpad 20 years ago, and it still is today. Some things just work great. They're tried and true and battle-tested. Like, you know, densely packed windows that are optimized for the precision of the mouse pointer.
