
The hardware is great, but the software is lacking. macOS only supports resolution-based scaling which makes anything but the default 200% pixel scaling mode look bad. For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry because macOS renders at a higher resolution and then downscales to the 4K resolution of the screen.

Both Windows and Linux (Wayland) support scaling the UI itself, and with their support for sub-pixel anti-aliasing (that macOS also lacks) this makes text look a lot more crisp.
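As I understand it, the macOS "scaled" modes work out like this for a 150% mode on a 27" 4K panel (a sketch of the arithmetic only, not Apple's actual code; the 2x backing store then downsample is the documented behaviour):

```python
# macOS scaled modes: the desktop is rendered into a 2x backing store for the
# chosen logical resolution, then the whole framebuffer is downsampled to the
# panel's native pixels. The numbers for "looks like 2560x1440" on 4K:
panel = (3840, 2160)                         # native 4K panel
logical = (2560, 1440)                       # chosen "looks like" resolution
backing = (logical[0] * 2, logical[1] * 2)   # 5120x2880 rendered internally
scale = panel[0] / backing[0]                # 0.75x, a non-integer downscale
print(backing, scale)
```

That 0.75x resample is where the softness comes from; at the default 200% mode the ratio is exactly 1.0 and no resampling happens at all.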



I would love to see examples of this. I have a MBP and a 24" 4K Dell monitor connected via HDMI. I use all kinds of scaled resolutions and I've never noticed anything being jagged or blurry.

Meanwhile in Linux the scaling is generally good, but occasionally I'll run into some UI element that doesn't scale properly, or some application that has a tiny mouse cursor.

And then Windows has serious problems with old apps - blurry as hell with a high DPI display.

Subpixel antialiasing isn't something I miss on macOS because it seems pointless at these resolutions [0]. And I don't think it would work with OLED anyway, because the subpixels are arranged differently than on a conventional LCD.

[0] I remember being excited by ClearType on Windows back in the day, and I did notice a difference. But there's no way I'd be able to discern it on a high DPI display; the conventional antialiasing macOS does is enough.


I'm more surprised that you're using a 24" display at any resolution. Of course, everyone has different preferences, but that just seems ridiculously small considering how available larger displays are for the same ppi and refresh rate probably.

I'm personally on the old 30" 16:10 2560x1600 form factor, and it's wildly better visually than the 27" 1440p screen by the same brand (all of them Dell) I use at the office.


> I'm more surprised that you're using a 24" display at any resolution

I have a 24" 4K Dell I bought when big 4K screens with good (measured) colors were still expensive. It's a very pleasant screen to use. Sure, it has less real estate than a bigger one, but this is somewhat mitigated by the fact that I can keep it closer to my eyes, so I can use smaller text.

I find it makes me more "focused" in a way. Can't have multiple windowfuls of crap visible at the same time. It's very practical for TWMs. It also works well in a dual screen scenario, for stronger separation when you need it, but I'm still not sure if a single bigger screen is better than two smaller ones for things like having docs up next to code for example.

I find I can't use two 27" or larger screens; they're just too big and I need to turn my head way too much for comfort. At work we have a 2x27" 4K setup, and I basically only use the screen in front of me. Lately I've been experimenting with pushing them very far away, but then I just need to increase text size and lose actual real estate.

> but that just seems ridiculously small considering how available larger displays are for the same ppi and refresh rate probably

I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI. Which screens are easily available with the PPI of a 4K 24"? I'd expect something like 5k 27" or 6k 32". These are very expensive (>1000 € for a crappy 27" Samsung, 2000 for a 32" Dell) and not that common, at least in France.


> I don't particularly care about refresh rates above 60 Hz (my laptop does 120 Hz, can see the difference, don't care). But I do care about PPI.

I feel basically the same way, and I don't like excessively wide screens or even 16:9. I've always preferred 16:10, and have wavered between 1, 2, and 3 screens over time. 16:9 27" 1440p is not a pleasant form factor, but it's fine in vertical mode.

I tend to prefer PPI, but not at the cost of screen real estate, and I tend to prefer 120 Hz, but not at the cost of PPI or picture quality. So the Dell UltraSharp 30" series from years ago, with IPS, 60 Hz, and 2560x1600, is perfect for now, and it also lets me run games without investing substantially in brand-new gaming PC hardware. The picture quality is great, the price on the used market is great, screen real estate is great; it's just not as sharp or fast as my Mac screen.

I've got my eyes on 32" 6K displays, but since they're so ungodly expensive, I'd really prefer them to have 120 Hz and good HDR, even though they're not priority attributes for me. I'd keep one of the 30" displays next to it in vertical mode for documentation or log files.


> I'm personally on the old 30" 16:10 2560x1600 form factor

I sorta wish that form factor had taken off instead of 27" 1440p. The extra vertical space is really nice, and that seems to be the ideal PPI for 100% scaling IMHO.

I keep telling myself I'd like to get a 4K OLED display at the same PPI, but 40" seems to be conspicuously missing in every monitor lineup... at least at a price that will convince me to buy three of them, anyway.
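For anyone curious about the density math here, a quick sketch (sizes are nominal diagonals, panels assumed flat 16:10 or 16:9):

```python
import math

def ppi(diag_in, px_w, px_h):
    """Pixels per inch for a panel given its diagonal and resolution."""
    return math.hypot(px_w, px_h) / diag_in

print(round(ppi(30, 2560, 1600)))  # ~101 PPI: the classic 30" 16:10
print(round(ppi(27, 2560, 1440)))  # ~109 PPI: 27" 1440p
print(round(ppi(40, 3840, 2160)))  # ~110 PPI: the wished-for 40" 4K
```

So a 40" 4K lands near 27" 1440p density, a bit above the old 30" panels; an exact PPI match for 4K would come out around a 43-44" diagonal.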


Agreed. I'm hoping that some more decent 6k 32" screens come out this year, but they're still all 16:9 which just sucks imo


Agree! I still have several (now discontinued) Philips 40 inch monitors, and that is the perfect size for programming work. Very little scrolling needed while you work. But I would love to have a 40 inch at 4K+ instead of 2560x1600. Why is no one making these? (I did get a Samsung 8K 50 inch, but that's too large for a multi screen setup)


Any other requirements? I noticed this one recently, but 40" is a bit big for my taste: https://www.dell.com/en-ca/shop/dell-ultrasharp-40-curved-th...


Yeah, I worked on that one. It's passable, but I don't like the aspect ratio very much; it's too wide. I'd rather have 40" at 16:9.


Ya idk what people are getting from ultrawides tbh. They're not great for video, not great for my neck, not enough vertical space, and can be disorienting for gaming. I can certainly imagine scenarios that would make them effective, but I'd just rather have more vertical space


I took one of my dual 24" office monitors during Covid WFH and ended up keeping it when I quit that job. I use it as a second display alongside the MacBook which is on a stand.

I think the largest I would want at my current desk is 27". 30 is way too big for me. But more importantly I want something that matches the crispness of the MBP display, and 1440p and 1600p are too low res.


Look at how many people only use their 14 inch laptop screen, it's ridiculous and terribly unergonomic.


This [1] has good examples. 24" 4K is on the smaller side and so less noticeable than on larger displays like 27" or 32".

[1] https://bjango.com/articles/macexternaldisplays2/


I have a MacBook Pro and a Linux machine attached to my dual 4K monitors.

Fonts on Linux (KDE Plasma on Wayland) look noticeably sharper than the Mac. I don't use subpixel rendering either. I hate that I have to use the Mac for work.


This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read. I ran the 5K Studio Display at 4K scaled for a bit but it was noticeably blurry.

This would've been easily solved with non-integer scaling, if Apple had implemented that.

(I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)
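The pixels-per-degree figure checks out roughly; a small sketch of the angular-resolution arithmetic (distances taken from the comment, both panels assumed 16:9):

```python
import math

def pixels_per_degree(diag_in, px_w, px_h, dist_m):
    """Pixels subtended by one degree of visual angle at a viewing distance."""
    width_in = diag_in * px_w / math.hypot(px_w, px_h)  # physical width
    ppi = px_w / width_in                               # pixels per inch
    dist_in = dist_m / 0.0254                           # metres -> inches
    # arc length of one degree at that distance, in pixels
    return ppi * dist_in * math.tan(math.radians(1))

print(round(pixels_per_degree(48, 3840, 2160, 1.75)))  # 48" 4K TV at ~1.75 m
print(round(pixels_per_degree(27, 3840, 2160, 1.00)))  # 27" 4K at 1 m
```

Both land near 110 pixels per degree, i.e. the two setups really do look about equally sharp.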


All through the 2000s Apple developed non-integer scaling support in various versions of Mac OS X under the banner of “resolution independence” - the idea was to use vectors where possible rather than bitmaps so the OS UI would look good at any resolution, including non-integer scaling factors.

Some indie Mac developers even started implementing support for it in anticipation of it being officially enabled. The code was present in 10.4 through 10.6 and possibly later, although not enabled by default. Apple gave up on the idea sadly and integer scaling is where we are.

Here’s a developer blog from 2006 playing with it:

> https://redsweater.com/blog/223/resolution-independent-fever

There was even documentation for getting ready to support resolution independence on Apple’s developer portal at one stage, but I sadly can’t find it today.

Here’s a news post from all the way back in 2004 discussing the in-development feature in Mac OS X Tiger:

> https://forums.appleinsider.com/discussion/45544/mac-os-x-ti...

Lots of folks (myself included!) in the Mac software world were really excited for it back then. It would have permitted you to scale the UI to totally arbitrary sizes while maintaining sharpness etc.


Yep, I played with the User Interface Resolution app myself back then in uni. The impact of Apple's choice to skip non-integer scaling didn't hit me until a few years ago when my eyes started to fail...


> This is correct and also increasingly affecting me as my eyes age. I had to give my Studio Display to my wife because my eyes can't focus at a reasonable distance anymore, and if I moved back further the text was too small to read.

> (I now use a combo of 4K TV 48" from ~1.5-2 metres back as well as a 4K 27" screen from 1 m away, depending on which room I want to work in. Angular resolution works out similarly (115 pixels per degree).)

The TV is likely a healthier distance to keep your eyes focused on all day regardless, but were glasses not an option?


Glasses would have been the "normal person" fix, but my eyes are great otherwise (better than 20/20 distance vision). So I could focus closer with glasses, but the lenses were worse quality than just sitting farther back.


If you can get used to using it (which really just requires some practice), the screen magnifier on Mac is fantastic and most importantly it’s extremely low latency (by this I mean, it reacts pretty much instantly when you want to zoom in or out).

Once you get used to flicking in and out of zoom instead of leaning into the monitor it’s great.

As an aside, Windows and Linux share this property too nowadays. Using the screen magnifiers is equally pleasant on any of these OSes. I game on Linux these days and the magnifier there even works within games.


Oh man... I'm in the same situation wrt eyesight. Are you coding on the 4K tv? I have enough space to make that configuration work. TIA


Yep, 4K is plenty of resolution for me running Sequoia. But I run it at a simulated 1920x1080@2x, since at native 4K the text would be way too small.


Thank you!


> For example, with a 27" 4K display many users will want to use 150% or 175% scaling to get enough real estate, but the image will look blurry

I use a Mac with a monitor with these specs (a Dell of some kind, I don't know the model number off the top of my head), at 150% scaling, and it's not blurry at all.


I also feel it's just fine. Not as amazing as the Apple displays, but I'll have to sit really close to make out the difference for text.


I just tested on my 4K display and 150% and 175% were not blurry at all. I'm on a 32 inch 4K monitor. Is it possible this information is out of date and was fixed by more recent versions of macOS?


Absolutely not fixed. Try looking at black text on a white background. It's not very obvious, but it's still a little annoying.


Interesting, maybe it just doesn't bother me, because I don't notice it at all. I was looking at black text on a white background. Maybe it's less of an impact on Q-OLEDs with their different pixel layout? I just checked and I actually run my ultra-wide monitor at 125% scaling and the text looks crisp. That one is a regular LED display, but it does have really high pixel density (5120x2160; I run it at 3360x1418).


> For example, with a 27" 4K display

4K is not enough pixels at 27" for Retina scaling.

Apple uses 5K panels in their 27" displays for this reason.

There are several very good 27" 5K monitors on the market now around $700 to $800. Not as cheap as the 4K monitors but you have to pay for the pixel density.
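A quick sketch of why the panel density matters for macOS's clean integer 2x mode (the "logical" desktop you get without any fractional resampling):

```python
import math

def ppi(diag_in, px_w, px_h):
    """Pixels per inch for a panel given its diagonal and resolution."""
    return math.hypot(px_w, px_h) / diag_in

# At the 2x mode the UI is sized as if the screen were half its density.
for name, px in [("27in 4K", (3840, 2160)), ("27in 5K", (5120, 2880))]:
    native = ppi(27, *px)
    print(name, round(native), "PPI native ->", round(native / 2), "PPI logical")
```

5K at 2x gives a ~109 PPI logical desktop (same as a classic 27" 1440p), while 4K at 2x gives only ~82 PPI, so everything is huge; that mismatch is what pushes people into the blurry fractional modes.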

There are also driver boards that let you convert 27" 5K iMacs into external monitors. I don't recommend this lightly because it's not an easy mod but it's within reason for the motivated Hacker News audience.


If your Mac goes bad it can be worthwhile. My friend gave me his pre-Retina 27" iMac, part of the circa-2008 generation of Macs whose GPUs all failed.

I removed all the computing hardware but kept the Apple power supply, instead of using the cheapo one that came with the LCD driver board I bought. I was able to find the PWM specs for the panel, and installed a cheap PWM module with its own frequency & duty-cycle display to drive it and control brightness.

The result is my daily desktop monitor. Spent way too much time on it, but it works great!


Apple still uses ancient 450 nm panels though; nowadays everyone and their dog has moved to 455-460 nm ones. 450 nm is considerably harsher on my eyes.


Wayland supports it (and Chrome supports it very well) but GTK does not. I run my UI at 200% scaling because graphical Emacs uses GTK to draw text, and that text would be blurry if I ran at my preferred scaling factor of 150% or 175%.


GTK uses Pango/Harfbuzz and some other components to draw text, all of which are widely used in other Linux GUI stacks. GTK/GDK do not draw text themselves, so your complaints are not with them directly.


I'm not asserting that text is being rendered incorrectly. I'm asserting that after rendering, the text is being downsampled.
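A toy illustration of that downsampling, independent of toolkit: render a crisp stripe into a 2x buffer, then scale it down by the 2x-to-1.5x ratio (a box filter here stands in for whatever resampling the compositor actually uses):

```python
def box_downscale(row, out_len):
    """Downscale a 1-D row of pixel values with a simple box (area) filter."""
    ratio = len(row) / out_len
    out = []
    for i in range(out_len):
        start, end = i * ratio, (i + 1) * ratio
        acc = 0.0
        for j in range(len(row)):
            # overlap of input pixel j, covering [j, j+1), with [start, end)
            overlap = max(0.0, min(j + 1, end) - max(j, start))
            acc += row[j] * overlap
        out.append(acc / ratio)
    return out

# A sharp 2-pixel black stripe on white, as rendered into the 2x buffer...
row = [255, 255, 0, 0, 255, 255, 255, 255]
# ...then downscaled 8 -> 6 pixels (the 0.75x step of a 150% mode).
# Edges that fall between output pixels come out gray, not black or white.
print([round(v, 1) for v in box_downscale(row, 6)])
```

At an integer ratio every input pixel maps cleanly onto output pixels and the edges survive; at 0.75x they can't, and that is the softness people report.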


This works with GTK for me at least. I've been using Gnome+Wayland with 150% scaling for almost 4 years now, and I haven't noticed any issues with GTK. Actually, my experience is essentially backwards from yours—anything Electron/Chromium-based needed a bunch of command-line flags to work properly up until a few months ago, whereas GTK apps always just worked without any issues.


If you're using a high-DPI monitor, you might not notice the blurriness. I use a standard 110-DPI monitor (at 200% scaling in Gnome) and I notice it when the scaling factor is not an integer.

Or more precisely, I noticed it eventually as a result of my being primed to notice it after people on this site insisted that GTK cannot handle fractional scaling factors.

Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice any blurriness in those apps (even on a standard 110-DPI monitor used at 150% and 175% scaling). The app I'm most conditioned to notice blurriness in is my browser, and Chrome's viewport is resolution independent except when rendering certain image formats -- text is always non-blurry.

Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default (and for years before that could be configured to), so I don't consider that worth dwelling on. If Xwayland is not involved, the contents of Chrome's viewport are non-blurry at all scaling factors except for the PNGs, JPGs, etc. For a long time, when run at a fractional scaling factor under Gnome (and configured to talk Wayland), the only part of Hacker News that was blurry was the "Y" logo in the top left corner; then about 2 years ago that logo's PNG file was replaced with an SVG file, and the final bit of blurriness on HN went away.


> If you're using a high-DPI monitor [...] I use a standard 110-DPI monitor (at 200% scaling in Gnome)

FWIW, I'm using a 184 DPI monitor with 150% scaling.

> you might not notice the blurriness. [...]

> Compared to the contents of a browser's viewport, Emacs and the apps that come with Gnome are visually simple, so it took me a year or 2 to notice

I'm pretty sensitive to font rendering issues—to the point where I've complained to publishers about their PDFs having unhinted fonts—so I think that I would have noticed it, but if it's really as subtle as you say, then maybe I haven't.

I do have a somewhat unusual setup though: I'm currently using

  $ gsettings set org.gnome.mutter experimental-features "['scale-monitor-framebuffer','xwayland-native-scaling']"
although that might not be required any more with recent versions. I've also enabled full hinting and subpixel antialiasing with Gnome Tweaks, and I've set the following environment variables:

  MOZ_ENABLE_WAYLAND=1
  QT_QPA_PLATFORM=wayland
  GDK_BACKEND=wayland,x11,*
  CLUTTER_BACKEND=gdk,wayland
  SDL_VIDEODRIVER=wayland
  SDL_VIDEO_DRIVER=wayland
  ECORE_EVAS_ENGINE=wayland_egl
  ELM_ENGINE=wayland_egl
  QT_AUTO_SCREEN_SCALE_FACTOR=1
  QT_ENABLE_HIGHDPI_SCALING=1
So maybe one of those settings would improve things for you? I've randomly accumulated most of these settings over the years, so I unfortunately can't really explain what (if anything) any of them do.

> Yes, Chrome's entire window can be quite blurry if Xwayland is involved, but it now talks to Wayland by default

Ah, good to hear that that's finally the default; that probably means that I can safely remove my custom wrapper scripts that forced those flags on.


Do you notice blurriness on MacOS when the Settings app (name?) has been used to change the scaling factor to a fractional value?


Sorry, but I haven't ever used a Mac, so I unfortunately can't answer that. I've used Windows with fractional scaling, and most programs aren't blurry there, but the few that don't support fractional scaling are really blurry.


That's an accurate summary of my experience with Windows, too.


> macOS renders at a higher resolution and then downscales to the 4K resolution

That seems weird to me. I remember that 20 years ago one of the whole points of Mac OS X was Display PDF, i.e. a vector-based UI.


While the original OS X display model, Quartz, evolved from Display PDF via NeXTSTEP, I believe that it shifted back to pixel rasterization to offload more of the display stack onto the GPU.

Quartz Extreme?

John Siracusa, Ars Technica:

> It's possible that existing consumer video cards could be coerced into doing efficient vector drawing in hardware. Apple tried to do just that in Tiger [note], but then had to back off at the last minute and disable the feature in the shipping version of the OS. It remains disabled to this day.

[note] https://arstechnica.com/reviews/os/macosx-10.4.ars/14

https://arstechnica.com/staff/2006/04/3720/


Have you ever seen a MacBook Air's screen? Those use fractional scaling and look fine.


Yeah this is correct, I don't know why you're being downvoted. The decisions Apple made when pivoting their software stack to high-DPI resulted in Macs requiring ultra-dense displays for optimal results - that's a limitation of macOS, not an indictment of less dense displays, which Windows and Linux accommodate much better.



