
Apple said during WWDC that the change was made to support a wider variety of display technologies and scaling modes. It’s an odd choice considering they still sell the non-retina MacBook Air.


It's kind of a funny definition of support.


Forcing app developers to live with the same lowest-common-denominator aesthetics that most users see will make them more likely to raise the bar for aesthetics on lowest-common-denominator devices.

It's the same reason that video game developers shouldn't be allowed to have powerful GPUs. They need to be testing how the game looks (and works) for most people, not for a group of people only slightly larger than themselves.


I don't think most game shops do. Their devs are typically comparing rendering on mid-range Nvidia and AMD cards, as well as Intel integrated graphics. Indie titles have to do so because they know their games go for $9–$15 and need to look consistent, or at least decent, on a lot of off-the-shelf laptops. Triple-A titles can afford huge testing teams that take on various Intel/AMD/Nvidia cards and report consistency issues.

Sure, if a title has a huge AMD or Nvidia logo when it starts, it's going to use a lot of functionality specific to that card, whatever those vendors pay them to advertise. But devs need to ensure titles at least look okay and run at 60fps or better to reach the largest sales audience.


Do you think most developers are using a non-retina screen? Compared to the general population I bet they skew very heavily towards newer, retina machines. Which means they'll barely notice anything. This is just plain lowering the bar, not raising the average.


I'd say most developers are docking with an external screen, and there are very few external screens that are retina. So yeah, I'd say most developers are using a non-retina screen.


> I'd say most developers are using a non-retina screen

I don't see this as a problem with 4K screens starting at 260 euros (~ US$ 306).


4K screens mostly run at 60Hz, making them terrible for anything but watching movies. I don't like coding on a screen where I can see the refresh as I scroll.


Yes, that is currently a problem for those who want that 120 Hz goodness.


If you work as a dev in a company, they're not going to be issuing 4k screens.


You're kidding, right?

I get my devs whatever they want for hardware, within reason. $500 or $1000 monitors? No questions asked. The price of hardware that will last years is nothing compared to the value of engineer happiness and productivity, especially when compared to how costly good engineers are in the USA and Canada.


I know of many companies that issue their developers 4K screens, so…


So get your own monitor. Granted, many employees make the decision that they won't pay for equipment themselves. However, it's still a decision.


Why in the world would I want to use a 4K screen when I can get a 2560x1440 screen?


Hmm, I can't reply to Symbiote for some reason. But yes, 2560x1440 screens and other wide variants are really preferred by gamers or people who want high refresh rates. 4K is limited to 60Hz unless you use DisplayPort 1.4 and some very new monitors.

There are quite a few users who prefer the wider format screens for a variety of reasons.
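
Back-of-the-envelope on why DisplayPort 1.4 is the cutoff (a rough sketch in Python; it only counts raw pixel payload and ignores blanking/protocol overhead, so real requirements run a bit higher):

    # Uncompressed 4K payload vs. approximate DisplayPort effective data rates.
    # Assumption: 8 bits per color channel (24 bpp), no blanking overhead.
    width, height, bpp = 3840, 2160, 24
    for hz in (60, 120):
        gbps = width * height * hz * bpp / 1e9
        print(f"4K @ {hz}Hz: ~{gbps:.1f} Gbps")
    print("DP 1.2: ~17.3 Gbps effective, DP 1.4: ~25.9 Gbps effective")

4K@60 (~11.9 Gbps) fits within DP 1.2; 4K@120 (~23.9 Gbps) needs DP 1.4.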


That's a silly question. More stuff fits in more pixels. Even if you have to compensate for readability by scaling fonts larger to make text the same size as it was on the lower dpi screen, the end result isn't actually equivalent to the lower dpi. It still works out that the higher dpi screen is on average more functional due to more things fitting on screen. Especially on dual/triple 27" external monitors on my desk. They are large and I am close to them, and a mere 1440 vertical pixels would be tolerable but crap when better is available. I usually have 4 to 10 terminal windows open where I want to see as much code and documentation and log output as possible, not a movie.
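
To put toy numbers on "more stuff fits" (a sketch with made-up but plausible font sizes, not a measurement): if you bump the font by roughly 1.3x instead of the full 1.5x pixel ratio, you still gain lines on the 4K panel.

    # Lines of terminal text that fit on a 27" panel (toy numbers).
    # Assumption: a line takes ~20px at 1440p font sizes; on 4K the font is
    # scaled ~1.3x for readability rather than the full 1.5x pixel ratio.
    panels = {"2560x1440, 20px lines": (1440, 20),
              "3840x2160, 26px lines": (2160, 26)}
    for name, (height_px, line_px) in panels.items():
        print(name, "->", height_px // line_px, "lines")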


A 4K screen is 3840 × 2160, why would you not prefer that?


Because "retina" can imply pixel doubling, making it more like an extra-crisp 1920x1080. The typical 1440p screen is nice and big, and replacing it with such a setup at the same physical size would needlessly cut down on what you can fit. If you could trust non-integer scaling to work properly then 4K would resoundingly beat 1440, but that's a pipe dream right now.

You can get the best of both with a 5K-6K screen, but that's currently a very expensive price point not on most people's radar.
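
Rough numbers for what a 3840x2160 panel "looks like" at the common scale factors (just panel pixels divided by the scale factor; the non-integer case is the scaling being doubted above):

    # Logical workspace on a 3840x2160 panel at common scale factors.
    panel_w, panel_h = 3840, 2160
    for scale in (1.0, 1.5, 2.0):
        print(f"{scale}x -> looks like {int(panel_w / scale)}x{int(panel_h / scale)}")
    # 2.0x: crisp 1920x1080; 1.5x: matches a 1440p screen but relies on
    # non-integer scaling; 1.0x: native, usually too small to read at 27".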


Because some computers can’t drive them above 30Hz.

Most 4K monitors are too small to run at native resolution.

The rest are too large to run 2x

And those monitors have too low of a ppi to run scaled
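
Rough ppi math behind those points (a sketch for common 4K panel sizes; the comfortable targets are roughly ~110 ppi for 1x and ~220 ppi for 2x):

    # Pixel density of a 3840x2160 panel at common diagonal sizes.
    from math import hypot
    for inches in (24, 27, 32):
        ppi = hypot(3840, 2160) / inches
        print(f'{inches}": ~{ppi:.0f} ppi')
    # ~184/~163/~138 ppi: too dense for comfortable 1x (native), well short
    # of ~220 ppi for clean 2x, so you're left with scaled modes.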


These are cheap 27" screens though, which are not ideal for macOS, since it's optimized for running at either 100% or 200% zoom. It's a shame that 4K screens at 24" and 5K screens at 27" are so hard to find. Apple uses these panels in its iMac line, but refuses to sell them as standalone displays.
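
Rough numbers on why those particular sizes suit macOS's 200% zoom (a sketch; "logical ppi" here is just physical ppi divided by two, and ~90–110 logical ppi is the classic comfortable range):

    # Logical density at 200% zoom for a few size/resolution combinations.
    from math import hypot
    panels = [("24\" 4K", 3840, 2160, 24),
              ("27\" 4K", 3840, 2160, 27),
              ("27\" 5K", 5120, 2880, 27)]
    for name, w, h, inches in panels:
        ppi = hypot(w, h) / inches
        print(f"{name}: ~{ppi:.0f} ppi, ~{ppi / 2:.0f} logical ppi at 2x")
    # 24" 4K and 27" 5K land near the familiar ~92 and ~109 logical ppi;
    # 27" 4K drops to ~82, so 2x looks oversized and you fall back to scaling.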


I've scaled one to a virtual 3000-ish pixels, and some of my colleagues do so as well. It's not so bad.
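
For reference, that virtual-3000-ish mode works by rendering larger and downsampling (a sketch of the arithmetic, assuming the standard "looks like 3008x1692" option; the exact figure depends on which scaled mode you pick):

    # macOS scaled mode: render at 2x the chosen logical size, then downsample
    # to the physical panel. Assumed logical size: 3008x1692 on a 4K panel.
    logical_w, logical_h = 3008, 1692
    backing_w, backing_h = logical_w * 2, logical_h * 2   # 6016x3384 framebuffer
    panel_w, panel_h = 3840, 2160
    print(f"render {backing_w}x{backing_h}, downsample ~{backing_w / panel_w:.2f}x "
          f"to {panel_w}x{panel_h}")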


Are they forcing app developers to not use retina screens?


Dark mode in Mojave points at OLED MacBooks. LCD subpixel antialiasing isn't working on OLED displays.
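
For context, classic subpixel antialiasing leans on the panel's physical R-G-B stripe: text coverage is sampled at 3x horizontal resolution and each sample drives one subpixel. A toy sketch of that idea (not Apple's actual renderer); OLED layouts like PenTile don't have that per-pixel stripe, so the trick breaks:

    # Toy subpixel AA: assumes black text on white and an RGB-stripe LCD.
    def subpixel_row(coverage_3x):
        """coverage_3x: text coverage samples (0..1) at 3x horizontal resolution."""
        pixels = []
        for i in range(0, len(coverage_3x) - 2, 3):
            r, g, b = coverage_3x[i:i + 3]
            pixels.append((1 - r, 1 - g, 1 - b))  # each channel dims its own subpixel
        return pixels

    print(subpixel_row([0.0, 0.5, 1.0, 1.0, 0.5, 0.0]))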



