etienne_marais
Honorary Master
- Joined
- Mar 16, 2008
- Messages
- 15,093
I have a Windows Forms application that renders some controls smaller when a low screen resolution is detected
(so that everything fits).
For my purposes, a low screen resolution is a vertical resolution <= 768.
I am using System.Windows.Forms.Screen.PrimaryScreen.Bounds for the detection (fortunately all of the
end-users only have a single screen).
The screen detection works fine on older displays as well as HD monitors at various resolutions, but on a
UHD monitor with a resolution of 3200x1800 the application ironically detects a low resolution, and it is
the only case where the low-screen-resolution rendering looks wrong for my application.
The particular computer giving problems happens to be running Windows 10, and I played around with the 'Graphic Properties'
options (Maintain Display Scaling vs. Maintain Aspect Ratio vs. Scale Full Screen, etc.), but no luck.
Any ideas? (Or an explanation of what is happening, as the screen 'looks' low-res.)
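For context, a likely explanation is DPI virtualization: when a WinForms process does not declare itself DPI-aware, Windows bitmap-scales it and reports the scaled-down virtual size through `Screen.PrimaryScreen.Bounds` (e.g. 3200x1800 at 250% scaling is reported as 1280x720, which is <= 768 vertically). A minimal sketch of a workaround is below, assuming opting the whole process into DPI awareness is acceptable; the preferred route is declaring `dpiAware` in app.manifest, but the P/Invoke call shows the idea:

```csharp
using System;
using System.Runtime.InteropServices;
using System.Windows.Forms;

static class Program
{
    // Opts the whole process into DPI awareness (Vista+ API in user32.dll).
    [DllImport("user32.dll")]
    private static extern bool SetProcessDPIAware();

    [STAThread]
    static void Main()
    {
        // Must be called before any window handle is created; otherwise
        // Windows keeps virtualizing (bitmap-scaling) the process and
        // Bounds continues to report the scaled-down virtual size.
        SetProcessDPIAware();

        // Now Bounds reports physical pixels (e.g. 3200x1800 on the UHD
        // monitor) rather than the virtualized size (e.g. 1280x720 at 250%).
        bool lowRes = Screen.PrimaryScreen.Bounds.Height <= 768;

        Application.EnableVisualStyles();
        Application.SetCompatibleTextRenderingDefault(false);
        // MainForm is a placeholder for your application's main form.
        Application.Run(new MainForm(lowRes));
    }
}
```

Note that once the process is DPI-aware, Windows stops scaling it for you, so the form itself may render small on high-DPI screens unless you also handle scaling (e.g. `AutoScaleMode.Dpi`).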