Hardware should lead. It’s easier to upgrade the software to make the hardware work than it is to upgrade the hardware when the software decides to support it.
My exposure to Linux is pretty minimal, especially Linux with a GUI, so forgive my ignorance. Even after reading over this thread, I’m confused about the issue here.
I don’t need an ELI5, but maybe someone can explain it like I don’t know what Wayland is?
My understanding is that an app should ask the system to display an object at X size, let’s say text at size 14. The system then works out that at the currently selected display resolution, size 14 will be Y pixels big. If needed, the system can scale that based on user preferences: a small, high-DPI screen could render size 14 at only a couple of millimetres, for example.
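To put numbers on my understanding, here’s a toy sketch of the conversion I’m describing (the function name and values are mine, not any toolkit’s API; 72 points per inch is the standard typographic convention):

```python
POINTS_PER_INCH = 72  # standard typographic convention

def points_to_pixels(point_size: float, dpi: float, scale: float = 1.0) -> float:
    """Convert a nominal point size to physical pixels for a given display."""
    return point_size * (dpi / POINTS_PER_INCH) * scale

# 14pt text on a conventional 96 DPI monitor at 1x:
print(points_to_pixels(14, dpi=96))               # ~18.7 px
# Same 14pt text on a 220 DPI laptop panel:
print(points_to_pixels(14, dpi=220))              # ~42.8 px
# Or with the user bumping their scaling preference to 1.5x:
print(points_to_pixels(14, dpi=96, scale=1.5))    # ~28 px
```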
Is the problem that devs are building things in a way that bypasses scaling? For example, hardcoding size 14 text to be Z pixels high?
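To make that question concrete, this is the hypothetical contrast I have in mind (both functions are invented for illustration, not any real toolkit’s behaviour):

```python
def aware_text_height(point_size: float, dpi: float, scale: float) -> float:
    # Derives the pixel height from the display's DPI and the user's
    # scale factor, so it tracks whatever the system decides.
    return point_size * (dpi / 72) * scale

def naive_text_height(point_size: float) -> int:
    # Bakes in "size 14 is 19 pixels" -- only correct at 96 DPI and 1x,
    # and tiny or blurry everywhere else.
    return 19
```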