    [VB6] DPI, DPI setting, Twips and DPI awareness

    I decided to write this "tutorial" because I see that these things often cause confusion or are difficult to understand.
    I'll try to explain it to the best of my knowledge.

    The article is intended to be technical, not about history, but I discuss a bit of history because I think it is necessary to understand the subject.

    If you want to understand the subject, please read at least the first two sections:
    - DPI, DPI setting and twips
    - DPI awareness

    The third section is:
    - How to set a program as "DPI aware" (to do).

    DPI, DPI setting and twips.

    DPI stands for dots per inch.
    It measures how many dots are in an inch.
    In printers, for example, 300 DPI means 300 dots in a physical inch, and 600 DPI means 600 dots per inch.
    If you draw a line of 2 inches at 600 DPI it will take 1200 dots. This is the printer's resolution: more DPI yields better quality and more detailed printing.

    In monitors "dots" are pixels. DPI in monitors means "pixels per inch".
    Resolution is how many pixels there are horizontally and vertically.
    Another concept related to the subject in monitors is "dot pitch".
    Dot pitch is the physical distance between two pixels.

    The dot pitch (the pixel size) directly relates to the physical DPI because it determines how many pixels there are in an inch.
    And the physical DPI along with the resolution determines the physical size of the monitor.
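    As a rough worked example (the monitor size and dot pitch here are illustrative numbers, not from any specific model):

    Code:
    Dim dotPitchMM As Double, physicalDPI As Double
    dotPitchMM = 0.277                ' typical 24" 1920x1080 panel (illustrative)
    physicalDPI = 25.4 / dotPitchMM   ' 25.4 mm per inch -> about 92 pixels per inch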

    Does Windows know the dot pitch of a monitor?

    No, or not always. Some monitors report their physical dimensions via EDID (a technology the monitor uses to report its capabilities to Windows), and Windows knows the resolution (for example 1920x1080), so it can compute the DPI. Other devices, such as TVs and old monitors, may not provide any information. So the DPI of a monitor, unlike that of a printer, is an estimate and cannot always be trusted as the real physical DPI.

    With modern LCD monitors, digital bidirectional links (such as HDMI) and EDID, Windows could obtain the real pixel pitch, derive the actual physical DPI, and adjust the Windows DPI setting to match.
    However, for some reason MS doesn't want to do that and prefers to offer the user a few standard DPI settings such as 100% (96 DPI), 125% (120 DPI) and so on...
    Probably because many programs are not ready to adjust themselves to an arbitrary DPI value.

    The screen DPI setting in Windows is not new. Windows XP had it, and probably older Windows versions too.

    At the time of CRT monitors there were no physical pixels in the monitor, so you could set the DPI manually based on your monitor's dimensions.
    Changing the DPI setting to a higher value (without changing the monitor's dimensions, obviously) told Windows, and the programs, to use more "pixels" per inch, as if the monitor had more pixels per inch (more DPI).

    You could also change the resolution, for example from 800x600 to 1024x768, and since CRTs had no real physical pixels, you could set whatever resolution you wanted.

    Both settings made things bigger (or smaller) on the screen, but in different ways.

    Resolution is how many pixels there are horizontally and vertically; DPI is how many pixels there are in an inch.

    Changing the resolution had an immediate effect on all programs: with more pixels, the desktop gets bigger in pixels, but everything on the screen gets smaller because the monitor's physical dimensions don't change.

    But, unlike resolution, changing the DPI setting didn't affect all programs, and some might not display correctly. Why?

    Some programs are coded entirely in pixels. When the DPI changes (i.e. the number of pixels in an inch changes), they do nothing: all-pixel code does not recognize the DPI change, and the display remains exactly the same.
    The programs that did not display correctly were (mainly) those that used pixels for some things and inches (twips, or another DPI-independent unit) for other things.

    Technically, the idea is that if the user sets a higher DPI, it is because a smaller (denser) screen was plugged in, so a program that wants to show its elements at the same physical size should scale up the sizes in pixels.

    In practice, that setting was used more like a zoom feature, to increase the size of the elements (including fonts, etc.).

    Besides pixels, VB6 offers inches, centimeters, millimeters and twips (listing only the most relevant ones).

    Centimeters, inches and millimeters are all real-life measurement units.
    But what about twips?
    A twip is the same kind of unit, just smaller:
    1 centimeter = 1 inch / 2.54
    1 millimeter = 1 inch / 25.4
    1 twip = 1 inch / 1440

    So 1440 twips = 1 inch.
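    You can check these relationships in VB6 itself with the ScaleX method of a form, which converts a value between scale modes (a minimal sketch; run it from any form):

    Code:
    ' All of these print 1440, the number of twips in one inch:
    Debug.Print ScaleX(1, vbInches, vbTwips)          ' 1 inch  -> 1440 twips
    Debug.Print ScaleX(2.54, vbCentimeters, vbTwips)  ' 2.54 cm -> 1440 twips
    Debug.Print ScaleX(25.4, vbMillimeters, vbTwips)  ' 25.4 mm -> 1440 twips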

    VB6 works in twips by default, and twips, like inches, are DPI-independent units.
    As long as you keep everything in twips, the program is ready to adjust to the DPI setting.

    On the other hand, if you work in pixels, your program is not ready to adjust itself to the DPI setting.

    VB6, working in twips, simplified the adjustment to DPI settings compared to languages that worked only in pixels.

    Conclusion: to adjust to the DPI setting, work in inches, centimeters, millimeters or twips.

    VB6 fonts already scale correctly with the DPI setting without you having to do anything. Images (icons, bitmaps, whatever), on the other hand, won't scale automatically because they are defined in pixels.
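    If you need an image to follow the DPI setting, you can stretch it yourself when drawing it. A minimal sketch, assuming two PictureBoxes I made up for the example, picSource (holding the original image) and picTarget (with AutoRedraw = True and ScaleMode = vbTwips):

    Code:
    ' Draw the image stretched to a fixed 1500 x 1500 twips (about one logical
    ' inch); the same twips area maps to more pixels at higher DPI settings.
    picTarget.PaintPicture picSource.Picture, 0, 0, 1500, 1500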

    What happens if the forms and controls are in pixels?
    If you work at design time in pixels, the values are still converted and stored in twips in the *.frm file.
    So when the form is opened in another DPI setting, they are automatically converted to that setting.
    For example, if you design at 96 DPI (100%) and set something to 100 pixels, save the project, change the screen to 120 DPI (125%) and then open the project again, it will now be 125 pixels, not 100.
    That's because the sizes are converted and stored in twips, not in pixels.
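    The arithmetic behind that example, written out (96 and 120 DPI give the standard 15 and 12 twips per pixel):

    Code:
    ' At design time, 96 DPI:   1440 / 96  = 15 twips per pixel
    '   100 pixels * 15 = 1500 twips   (this is what the .frm stores)
    ' Reopened at 120 DPI:      1440 / 120 = 12 twips per pixel
    '   1500 twips / 12 = 125 pixels   (same physical size, more pixels)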

    So far, so good: it seems you can also work in pixels without problems.
    But the problems start when you set any dimension or position in pixels from code.
    If you set something to 100 pixels by code, it will be set to 100 pixels. And 100 pixels are not the same size in all DPI settings.

    Conclusion: don't work in pixels, unless you really want that element to keep its pixel size no matter the DPI setting. And in that case, always do it in code; don't mix design-time pixels with pixel sizes set in code.

    If you still prefer to work in pixels, the alternative is to set everything in code, multiplied by a modifier derived from the DPI setting.

    For example, 100 pixels * modifier:

    Code:
    Dim modif As Double
    modif = (1440 / Screen.TwipsPerPixelX) / 96   ' = DPISetting / 96
    Label1.Width = 100 * modif   ' assumes the form's ScaleMode is vbPixels
    modif must be DPISetting / 96: at 96 DPI (100%) it is 1, at 120 DPI (125%) it is 1.25, and so on. The DPI setting itself can be derived as 1440 / Screen.TwipsPerPixelX, because a logical inch is always 1440 twips.

    But I repeat: I think it is better (and simpler) to work in twips and convert only in the cases where you really need pixels, for example when calling APIs.

    Twips are always twips: at design time, in code, and under different DPI settings. In the end, twips are converted to pixels according to the current DPI setting.
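    When you do need pixels, for example to pass coordinates to an API, the conversion uses Screen.TwipsPerPixelX/Y. A minimal sketch (MoveWindow is just one example of an API that expects pixels; the sub name is made up):

    Code:
    Private Declare Function MoveWindow Lib "user32" (ByVal hWnd As Long, _
        ByVal X As Long, ByVal Y As Long, ByVal nWidth As Long, _
        ByVal nHeight As Long, ByVal bRepaint As Long) As Long

    Private Sub MoveToTopLeft(frm As Form)
        ' Convert the form's size from twips to the pixels the API expects.
        MoveWindow frm.hWnd, 0, 0, _
            frm.Width \ Screen.TwipsPerPixelX, _
            frm.Height \ Screen.TwipsPerPixelY, 1
    End Sub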

    DPI awareness

    Before Vista there was no "DPI awareness" setting.
    Technically, everybody should have been aware that there was a DPI setting in Windows and that programs had to take that setting into account and adapt to it.
    In practice, few programs did.

    Many programs were not prepared for different DPI settings because at the time most users were using just the default of 96 DPI, and most didn't know how to change that setting anyway (it was an advanced setting).
    Also, it takes some work to make a program adapt to the DPI setting, and in many cases the programmers didn't know about the setting at all (they were "not aware" of it).

    With CRT monitors it was easy to make everything bigger: change the resolution, for example from 1024x768 to 800x600.
    CRTs didn't have physical pixels, so you could do that without side effects.
    But with LCD monitors, which became popular at the time, the display doesn't look good if the screen resolution doesn't match the panel's native resolution.
    So it was no longer advisable to change the resolution in order to enlarge the elements on the screen.
    The alternative was to change the DPI setting.

    Changing the DPI setting for this wasn't technically correct, because the setting was originally meant for when you plugged in another monitor with a different dot pitch (pixel size). But in practice it served to enlarge the elements on the same monitor, as long as the programs responded to the setting, that is, as long as they were programmed in inches rather than pixels, because their programmers were aware of the setting and did that work.
    And there was no other setting to touch anyway; Windows had no concept of "zoom" or scaling.

    Also, as users got older and needed to see things bigger, more and more of them started to set higher DPI values.

    The result: some programs were prepared and displayed correctly, other programs ignored the setting completely (they were programmed in pixels), and other programs didn't display well (most probably because they mixed pixels with inches, or twips in the case of VB6).

    (Side note: in Windows 10 the setting is presented to the user directly as a scale, in percent; the actual DPI isn't even mentioned.
    So this workaround of changing the DPI setting to enlarge the elements on the screen, even though the setting was not originally intended for that purpose, became the de facto (and now official) way of doing it.)

    Therefore, in Windows Vista, MS introduced a new setting: DPI awareness.
    What is that?
    Programs, of course, are not aware or unaware of anything; people are.
    So if a program is set as "DPI aware", it means that someone, a person (in most cases the programmer), is aware of this issue and "certifies" that the program runs well at any DPI setting. Or at least that this person knows about the issue and wants to run the program at different DPI settings anyway, because the program seems to run fine, or for whatever other reason.

    So, programs themselves are not aware or unaware; as said before, they fall into one of three categories:
    1) Prepared to enlarge things according to the setting.
    2) Ignoring the setting and always showing the same thing regardless of the DPI (programs made entirely in pixels).
    3) Showing some things well and others not, most probably because of mixing inches (or twips) with pixels.
    The issues are almost always graphical, but a program could conceivably crash because it is not prepared for a certain DPI setting (for example due to a numeric overflow, or an element sized to less than zero pixels), although I haven't experienced that kind of issue with any program so far.

    For cases 1) and 2) you could set the program (even an old program that is not yours) as "DPI aware": in case 1) because it works correctly, in case 2) because you don't mind that it ignores the setting.
    For case 3), if the issues are minor you could perhaps do it anyway; if they are not, better not to set it as DPI aware, because it won't display well.

    How to set a program as "DPI aware"

    For now, refer to LaVolpe's tutorial Being DPI aware.
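    As a small taste of what is involved (a minimal sketch, not a replacement for that tutorial): one way to declare a program DPI aware at runtime is the SetProcessDPIAware API, available since Vista. It must be called before any window is created, so a Sub Main startup (in a .bas module) is assumed here, and Form1 is just a placeholder name. The more robust way is a manifest entry, which LaVolpe's tutorial covers.

    Code:
    Private Declare Function SetProcessDPIAware Lib "user32" () As Long

    Sub Main()
        ' Tell Windows this process handles DPI itself, so it won't be
        ' bitmap-stretched (blurred) at DPI settings above 100%.
        SetProcessDPIAware
        Form1.Show
    End Sub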
