
Thread: Can VB6 app benefit from GPU?

  1. #1

    Thread Starter
    Addicted Member
    Join Date
    Jan 2012
    Posts
    245

    Can VB6 app benefit from GPU?

    Hi,

    My app does a lot of graphical drawing using various Windows APIs from GDI (for schematics: lines, rectangles, fills, circles, etc.) and GDI+ (for images, e.g. JPG or PNG with transparency).

    Is it possible with VB6 to take advantage of a dedicated video card/GPU (e.g. NVIDIA) if one is available in the PC the app is running on? And if so, is there something the app needs to do, or does it happen automatically?

    Thx,
    Erwin

  2. #2
    The Idiot
    Join Date
    Dec 2014
    Posts
    2,721

    Re: Can VB6 app benefit from GPU?

    GDI32 uses the GPU for BitBlt/AlphaBlend.
    But if you want more power: DirectX, D2D, or OpenGL.

  3. #3
    Addicted Member ISAWHIM's Avatar
    Join Date
    Jan 2023
    Posts
    181

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by Erwin69 View Post
    Is it possible with VB6 to take advantage of a dedicated video card/GPU (e.g. NVIDIA) if one is available in the PC the app is running on? And if so, is there something the app needs to do, or does it happen automatically?
    Windows is built on DirectX, so many of the old GDI and GDI+ functions will, to a small extent, use GPU acceleration to draw.
    (To elaborate: Windows itself uses the same GPU calls that DX/GL use. I am not saying that Windows runs on DX, but Microsoft has been converting many of the old graphics functions to share those code paths where they can. Nothing is worse than a GPU driver update that suddenly causes "gaming issues" in normal Windows.)

    However, that is not 100% of every function, nor every situation. For truly dedicated graphics drawing you need to use DX, GL, or an SDK for Windows, NVIDIA, Intel graphics, or AMD, or use a wrapper of some kind.

    If you watch your GPU in Task Manager ([Performance]->[GPU]) while doing something like heavy BitBlt or some GDI+ functions, you will see 3D usage rocket to life and your GPU memory spike quite a bit. (This is not "Windows activity"; it is actual GPU-assisted processing for the windows you are using GDI/GDI+ within. It is not something you can control directly, unlike with DX, GL, or a card-specific SDK.)

    NOTE: You really have to be doing a LOT to see real action.

    I have a few GPUs... this is showing my BitBlt alpha-blend demo running, which raises my GPU usage a "little". Not quite as much as when I am playing a DX game, but close enough. (For one single GDI call, done a million times.)



    P.S. If you want an EASY way to get 3D GPU acceleration in your program, use the browser control and an HTML5 CANVAS, which uses 3D acceleration by default. Fairly easy to use, with a massive set of samples online. Bonus: you can easily adapt your program to JavaScript and go full-app or full-online from that point.
    Last edited by ISAWHIM; Mar 21st, 2023 at 05:08 PM.

  4. #4
    The Idiot
    Join Date
    Dec 2014
    Posts
    2,721

    Re: Can VB6 app benefit from GPU?

    Well, it's not just about GPU acceleration, which for the moment only GDI32 is using.
    It's about the memory used for the purpose.
    When I use D2D, the pictures are loaded into GPU memory.
    When I use GDI, the pictures are loaded into CPU memory.

    I can, however, switch between software and hardware when using D2D.
    And what happens? I can no longer use the monitor sync; instead I need a coded timer.

    That is why BitBlt/AlphaBlend/StretchBlt are NOT equal to any D2D equivalent in performance.
    And with StretchBlt you notice it immediately if you try to stretch, even when not using any halftone mode (which I think is CPU-based).

  5. #5
    Addicted Member ISAWHIM's Avatar
    Join Date
    Jan 2023
    Posts
    181

    Re: Can VB6 app benefit from GPU?

    Yes, that is about manually using them, forcing them into VRAM. (Though then you are at the mercy of your own code.) Windows WILL load what it needs into VRAM; otherwise it couldn't display the image. It's not displaying an image from RAM, it's in VRAM, encoded. All GDI actions take place in virtual memory space, which is now effectively shared with the GPU. Image data, if it can be processed faster by a GPU command, goes directly to GPU memory space. Otherwise it is processed in software using system RAM, and the result goes to the GPU afterwards for display.

    Windows uses GPU memory and "commands" to assist with "general processing functions" as well. It's not just about GPUs for graphics anymore. They are up to 4000x faster than a CPU at almost everything they do, which is why they are used for almost everything math-related now. (Which is about 90% of what most programs are doing: complex math and graphics, when not doing "text processing".)

    Sound processing (mp3/ogg/flac/wav), Video processing (avi/divx/xvid/mp4/h###), Graphics (dx/gl/gdi), most 3D related math functions (cuda), almost all "matrix math" (cuda/tensor), anything VR related (multi-view virtual-display), even simple OS RAM/HD acceleration (Copy 0-6), Security (encryption/decryption).

    [Attachment Image1.jpg: Task Manager GPU usage screenshot]

    So, as I said: you WILL get GPU acceleration doing "nothing special". You can get MORE by doing special things, or less if you don't know what you are doing. You may just waste time pushing things into GPU processing that gain no speed, like "line drawing" and "individual pixel manipulation", or anything that requires "System RAM Memory Manipulation" as part of the function. Or, as mentioned, anything "text related", meaning processing, not drawing text. All font "glyphs" are now handled in GPU memory space.

    P.S. Windows has even made VB6 use multiple cores and threads now, without needing to manually tell it to use them. (You can still do that too, manually, if needed.) The red line is where I stopped the demo program. It was doing nothing but looping through 10,000,000 GDI (BitBlt) and simple math functions. One core clearly has dominance, followed by another doing parallel processing, followed by 9 more doing tiny things. 11 cores total, were being used by VB6, running the compiled program. (In all fairness, I am about 80% sure that 4-6 of those were "cleaning-up stuff", not directly used for code-processing.)

    [Attachment Image1.jpg: Task Manager per-core CPU usage screenshot]
    Last edited by ISAWHIM; Mar 22nd, 2023 at 09:05 AM.

  6. #6
    The Idiot
    Join Date
    Dec 2014
    Posts
    2,721

    Re: Can VB6 app benefit from GPU?

    I'm doing games and I know a bit about performance.
    As I wrote, BitBlt/AlphaBlend/StretchBlt use GPU acceleration, but the hDC is still in CPU memory.
    If you use anything else in GDI32/GDI+, it will be calculated in the CPU memory sphere.
    D2D, if initialized in hardware, will use GPU memory and perform operations directly on the "bitmap" that is located on the GPU.
    That is why I don't use an hDC or a GDI+ Bitmap: both are located in CPU memory.
    No matter that those 3 GDI32 commands have GPU acceleration, everything else is locked to the "surface" in CPU memory.
    Sure, we have VRAM, the buffer memory; once you have run something once, it can be recalled faster.

    But even so, you can NOT compare CPU-based operations with GPU-based ones. It's ridiculous.
    I have been doing games for a long time and I have pushed GDI to its limits, so I know what works and what doesn't.
    And even with the GPU acceleration I was limited in what I could do. As I wrote, upscaling/interpolation is all so slow that I cannot use it at all.
    While in D2D, there's almost no change in GPU usage when upscaling with interpolation applied.

  7. #7
    PowerPoster PlausiblyDamp's Avatar
    Join Date
    Dec 2016
    Location
    Pontypool, Wales
    Posts
    2,458

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by ISAWHIM View Post
    P.S. Windows has even made VB6 use multiple cores and threads now, without needing to manually tell it to use them. (You can still do that too, manually, if needed.) The red line is where I stopped the demo program. It was doing nothing but looping through 10,000,000 GDI (BitBlt) and simple math functions. One core clearly has dominance, followed by another doing parallel processing, followed by 9 more doing tiny things. 11 cores total, were being used by VB6, running the compiled program. (In all fairness, I am about 80% sure that 4-6 of those were "cleaning-up stuff", not directly used for code-processing.)
    Activity spread over multiple cores is not the same as multiple threads. A single-threaded application will show that behaviour; a thread under Windows isn't locked to a specific core.

    Unless Task Manager or similar is showing the application as having multiple threads, it is still running as a single thread.

  8. #8
    Addicted Member ISAWHIM's Avatar
    Join Date
    Jan 2023
    Posts
    181

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by PlausiblyDamp View Post
    Unless Task Manager or similar is showing the application as having multiple threads, it is still running as a single thread.
    I get 3-7 threads, depending what program I launch. 150-1000 handles, at times. (I assume hyper-threading?)



    Could be one thread for the APP, one for the window and one for the title bar... (By the way, this specific time-demo hates running compiled while Task Manager is also open; it causes it to hang at the end of the test, "Not responding".)

    The only other thing I can think of is that each PictureBox, TextBox or RichTextBox might also have its own thread in this new setup they use. I normally have a few pictureboxes or picture objects for back-buffering. This specific program has two back-buffers (the three individual threads?). Only one is visible.
    Last edited by ISAWHIM; Mar 22nd, 2023 at 02:57 PM.

  9. #9
    Angel of Code Niya's Avatar
    Join Date
    Nov 2011
    Posts
    8,598

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by ISAWHIM View Post
    Windows uses GPU memory and "commands" to assist with "general processing functions" as well. It's not just about GPUs for graphics anymore. They are up to 4000x faster than a CPU at almost everything they do, which is why they are used for almost everything math-related now.
    It is essential to exercise caution when making assertions such as this. CPUs and GPUs excel in distinct areas, and one cannot simply replace the other. GPUs hold an advantage over CPUs only for calculations that can be parallelized.

    Algorithms that necessitate a specific order of execution are not suitable for implementation on a GPU, and this encompasses the majority of code in existence. These algorithms execute one task followed by another, with each subsequent task dependent on the previous one. In contrast, algorithms that allow for steps to be executed in any order can be effectively implemented on a GPU.

    A prime example of this is the process of darkening an image. The darkening of one pixel is independent of the darkening of other pixels, enabling all pixels to be darkened simultaneously. This characteristic makes such operations ideally suited for GPUs.
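    Niya's darkening example can be sketched like this (Python used purely for illustration; a real GPU version would be a shader or CUDA kernel, and the 0.5 darkening factor is an arbitrary choice). The point is that each output pixel depends only on its own input pixel, so the work can be done in any order, or all at once:

```python
from concurrent.futures import ThreadPoolExecutor

def darken(pixel, factor=0.5):
    """Darken one RGB pixel; the result depends only on that pixel."""
    r, g, b = pixel
    return (int(r * factor), int(g * factor), int(b * factor))

def darken_image(pixels):
    # Sequential version: the iteration order does not matter...
    return [darken(p) for p in pixels]

def darken_image_parallel(pixels):
    # ...so the same work can be farmed out to workers (or GPU threads)
    # with no coordination between them, and the result is identical.
    with ThreadPoolExecutor(max_workers=4) as pool:
        return list(pool.map(darken, pixels))

image = [(255, 128, 0), (10, 20, 30), (200, 200, 200)]
assert darken_image(image) == darken_image_parallel(image)
print(darken_image(image))  # [(127, 64, 0), (5, 10, 15), (100, 100, 100)]
```

    Contrast this with an algorithm like a running sum, where each step needs the previous step's result: that dependency chain is what keeps most ordinary code on the CPU.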
    Treeview with NodeAdded/NodesRemoved events | BlinkLabel control | Calculate Permutations | Object Enums | ComboBox with centered items | .Net Internals article(not mine) | Wizard Control | Understanding Multi-Threading | Simple file compression | Demon Arena

    Copy/move files using Windows Shell | I'm not wanted

    C++ programmers will dismiss you as a cretinous simpleton for your inability to keep track of pointers chained 6 levels deep and Java programmers will pillory you for buying into the evils of Microsoft. Meanwhile C# programmers will get paid just a little bit more than you for writing exactly the same code and VB6 programmers will continue to whitter on about "footprints". - FunkyDexter

    There's just no reason to use garbage like InputBox. - jmcilhinney

    The threads I start are Niya and Olaf free zones. No arguing about the benefits of VB6 over .NET here please. Happiness must reign. - yereverluvinuncleber

  10. #10
    PowerPoster yereverluvinuncleber's Avatar
    Join Date
    Feb 2014
    Location
    Norfolk UK (inbred)
    Posts
    2,235

    Re: Can VB6 app benefit from GPU?

    There are some jobs you simply want to hand over to other threads, tasks that don't need to slow the main operation down. In the future I am hoping that for these sorts of tasks I can pop their code into a new thread, initiated and marshalled correctly, using Twin/RadBasic/SchmidtBasic (whichever).

    One could create another binary, I suppose, and call that just to do a discrete function, such as saving parameters, which can then be forced to occur asynchronously. You might have to create a shared memory location for that and open your main program to the new binary so that it has access to the updated params.

    Then there are those operations that are just handled efficiently by the GPU. I am hoping to take advantage of GPU optimisation in my own program later on, but for now I am just trying to optimise CPU usage during animation using GDI+. I am watching these sorts of threads avidly for when I 'upgrade' to Direct2D.
    https://github.com/yereverluvinunclebert

    Skillset: VMS,DOS,Windows Sysadmin from 1985, fault-tolerance, VaxCluster, Alpha,Sparc. DCL,QB,VBDOS- VB6,.NET, PHP,NODE.JS, Graphic Design, Project Manager, CMS, Quad Electronics. classic cars & m'bikes. Artist in water & oils. Historian.

    By the power invested in me, all the threads I start are battle free zones - no arguing about the benefits of VB6 over .NET here please. Happiness must reign.

  11. #11
    PowerPoster PlausiblyDamp's Avatar
    Join Date
    Dec 2016
    Location
    Pontypool, Wales
    Posts
    2,458

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by ISAWHIM View Post
    I get 3-7 threads, depending what program I launch. 150-1000 handles, at times. (I assume hyper-threading?)
    Hyper-threading is a hardware thing, it is a way of making one physical core behave as two virtual cores (very simplified explanation - https://en.wikipedia.org/wiki/Hyper-threading gives a bit more of the technical side)

    Quote Originally Posted by ISAWHIM View Post
    Could be one thread for the APP, one for the window and one for the title bar... (By the way, this specific time-demo hates running compiled while Task Manager is also open; it causes it to hang at the end of the test, "Not responding".)

    The only other thing I can think of is that each PictureBox, TextBox or RichTextBox might also have its own thread in this new setup they use. I normally have a few pictureboxes or picture objects for back-buffering. This specific program has two back-buffers (the three individual threads?). Only one is visible.
    Unless your application is written to use multiple threads, or the framework it is based on uses multiple threads, Windows itself won't just make the app multithreaded. Even creating a thread for the window would cause problems due to cross-thread calls, ownership of GUI resources, etc.

  12. #12
    Addicted Member ISAWHIM's Avatar
    Join Date
    Jan 2023
    Posts
    181

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by PlausiblyDamp View Post
    Unless your application is written to use multiple threads, or the framework it is based on uses multiple threads, Windows itself won't just make the app multithreaded. Even creating a thread for the window would cause problems due to cross-thread calls, ownership of GUI resources, etc.
    Well, there is nothing in that program using threads. Clearly, Windows has assigned multiple threads to the program; I didn't. Like I said, it's been doing this since the latest Windows 10 updates, after Win 11 was released. Originally I thought it was Windows creating a 32-bit VENV. But then I heard a lot of the older DLLs now use pointers to newer DLLs, replacing whole functions within them. (GDI was one of those. WebControl was another. RichTextBox was another.)

    I wouldn't know where to begin to dig deeper into what Windows/VB6 is stuffing into the threads. I am sure that can be figured out. Same with what it is clearly sending to the video card for "3D" processing, when the only graphics calls I am using are a simple GDI AlphaBlend function and a GDI BitBlt function. Two things which, previously, were not processed in GPU space and memory. (Except for display. But the volume of processing being done is NOT "just display" processing. I can shift images from RAM {objects in memory} into a picture box, and that doesn't make my GPU "3D Processing" flinch.)

    I am also getting 1,800 frames per second (600px x 240px) AlphaBlending(0-255 shades) + BitBlt(to restore the original back-buffer for blending). Previously, when this demo was run on an older version of Windows-10, it was barely pushing 200 frames per second for this dual-call combo.

    There is one more call, to restore the original canvas, every 1:100 after the 0-255 "blended shades".

    Not one "thread code line" anywhere in this program, or "direct-x" or "openGL". Just raw GDI API calls and a few basic math formulas and logic comparisons. (Sorry for all the comments. I do that a lot when doing time-demos and comparisons, for various reminders.)

    NOTE: For some reason, this loop hangs after execution when compiled. (Waiting for the GPU or a thread to release?) It doesn't do that when running in the IDE. It hangs longer if Task Manager is open, and it used to crash silently if Task Manager was open. (Dead-app check?)

    Also, that "loop" was just to try to slow it down... I was trying to stop it from crashing or hanging. I forget what I did to resolve the crashing, but it still "hangs".

    Code:
    Option Explicit
    
    Private Declare Function GetTickCount Lib "kernel32" () As Long
    
    Private Type RectAPI
        Left As Long
        Top As Long
        Right As Long
        Bottom As Long
    End Type
    
    ' Poly-drawing API functions...
    ' StretchBlt, StretchDIBits
        
    Private Declare Function BitBlt Lib "gdi32" ( _
        ByVal hDestDC As Long, _
        ByVal x As Long, ByVal y As Long, _
        ByVal nWidth As Long, ByVal nHeight As Long, _
        ByVal hSrcDC As Long, _
        ByVal xSrc As Long, ByVal ySrc As Long, _
        ByVal dwRop As Long) As Long
    
    Const AC_SRC_OVER = &H0
    Private Type BLENDFUNCTION
        BlendOp As Byte
        BlendFlags As Byte
        SourceConstantAlpha As Byte
        AlphaFormat As Byte
    End Type
    
    Private Declare Sub CopyMemory Lib "kernel32" Alias "RtlMoveMemory" (lpDest As Any, lpSource As Any, ByVal cbCopy As Long)
    ' 32bpp only, Lots of notes... Needs preblended alpha?
    ' https://learn.microsoft.com/en-us/windows/win32/api/wingdi/ns-wingdi-blendfunction
    Private Declare Function AlphaBlend Lib "msimg32" ( _
        ByVal hDestDC As Long, _
        ByVal x As Long, ByVal y As Long, _
        ByVal nWidth As Long, ByVal nHeight As Long, _
        ByVal hSrcDC As Long, _
        ByVal xSrc As Long, ByVal ySrc As Long, _
        ByVal widthSrc As Long, ByVal heightSrc As Long, _
        ByVal BLENDFUNCT As Long) As Boolean
    Private Declare Function GdiAlphaBlend Lib "gdi32" ( _
        ByVal hDestDC As Long, _
        ByVal x As Long, ByVal y As Long, _
        ByVal nWidth As Long, ByVal nHeight As Long, _
        ByVal hSrcDC As Long, _
        ByVal xSrc As Long, ByVal ySrc As Long, _
        ByVal widthSrc As Long, ByVal heightSrc As Long, _
        ByVal BLENDFUNCT As Long) As Long
    
    Dim AlphaBF As Long
    Const USE_BITMAP_ALPHA = &H1000000 'AC_SRC_ALPHA scaled up to the 4th byte of a long
    'AlphaBF = 128 * &H10000  'semi transparent ignoring bitmaps alpha channel
    'AlphaBF = 255 * &H10000 Or USE_BITMAP_ALPHA 'fully opaque using bitmaps alpha channel
    
    Private Sub Command1_Click()
        Dim i As Long
        Dim x As Long
        Dim y As Long
        Dim oldT As Long
        'Dim BF As BLENDFUNCTION
        'With BF
        '    .BlendOp = AC_SRC_OVER ' Value is always 0 here
        '    .BlendFlags = 0
        '    .SourceConstantAlpha = 1
        '    .AlphaFormat = 0
        'End With
        
        oldT = GetTickCount()
        
        ' Copies the "original image" to a "copy" for restoring the original, when needed
        Call BitBlt(pScreenCopy.hDC, 0, 0, pScreen.Width, pScreen.Height, pScreen.hDC, 0, 0, vbSrcCopy)
        ' Had to refresh, because bitblt doesn't trigger an actual paint event
        pScreenCopy.Refresh
        ' This just creates the "Solid black surface", used to darken the screen, placed in a buffer, because it was there.
        Call BitBlt(pBuff.hDC, 0, 0, pScreen.Width, pScreen.Height, pScreenCopy.hDC, 0, 0, vbBlackness)
        ' Sets the actual "result" into the picturebox's picture, since this was a bitblt creation
        pBuff.Picture = pBuff.Image
        
        y = oldT + 1
        ' Change this to 60, if you uncomment the "DO LOOP"
        For x = 1 To 100 '1000001
            For i = 1 To 254
                '##################################
                ' Forced delay to "stall" the loop
                'Do Until GetTickCount > y
                '    i = i + 1 - 1
                '    DoEvents
                'Loop
                'y = y + 1
                '##################################
                ' Original "custom type" conversion into an actual LONG
                'BF.SourceConstantAlpha = i
                'CopyMemory AlphaBF, BF, 4
                ' Faster method to convert needed values into a LONG
                AlphaBF = RGB(0, 0, i)
                ' Changed to CALL, in hope that it "disposed" of the return values faster
                ' Applies the "original image COPY" to the visible screen. (Not working in the buffer here, for time-demo)
                Call BitBlt(pScreen.hDC, 0, 0, pScreen.Width, pScreen.Height, pScreenCopy.hDC, 0, 0, vbSrcCopy)
                ' Does the "blend" of the black image in the BUFFER, onto the "original" image in the display
                ' This is the source of the "crashing". I can do a billion normal bitblt calls without crashing
                ' Something in the alphablend or gdialphablend is causing the crash
                ' I didn't use gdialphablend, because it returns a LONG, which holds specific info about "failures".
                ' However, this "crash" doesn't return anything, it just dies silently in the DLL, I assume.
                ' Hoped that returning less info would let it live longer, but it doesn't and it's not any faster or slower
                Call AlphaBlend(pScreen.hDC, 0, 0, pScreen.Width, pScreen.Height, pBuff.hDC, 0, 0, pBuff.Width, pBuff.Height, AlphaBF)
                ' Only needed in the time-demo, to "see" the results on the "original" image
                pScreen.Refresh
            Next i
            ' Just restores the original picture after the "time-demo" is done, for a second run
            Call BitBlt(pScreen.hDC, 0, 0, pScreen.Width, pScreen.Height, pScreenCopy.hDC, 0, 0, vbSrcCopy)
            pScreen.Refresh
        Next x
        
        Form1.Caption = FormatNumber(CDbl(GetTickCount - oldT) / 1000, 2) & " sec"
    End Sub
    
    Private Sub Form_Load()
        Form1.Height = 4485
    End Sub
    Maybe someone with actual knowledge, unlike myself, could shed some light on the subject. I just know it's "accelerated" and "threaded" and "using multiple cores", but not by my instruction. I am honestly a spectator, taking whatever BS I find online, if it sounds plausible, as layman's gospel.

    Like I KNOW these two AlphaBlend functions are the same NOW, because Windows says so in the help files. :P Not sure why only ONE returns a value and the other doesn't. Guess they just throw it away along the way.

    The function in "msimg32.dll" is just a pointer redirection to the function in "gdi32.dll". Which I NOW (as in, "today") assume is another pointer to some DirectX/OpenGL "AlphaBlend" function on the GPU, explaining the sudden hangs, acceleration, threads and crashes I was/am getting.

    Reminds me of the old days of DirectX... If you didn't have it set up correctly, you would see a blue screen where the game should be. Windows was just drawing a "target" for the video card to find on the screen, where it would overlay the DX images onto that screen space. Print Screen would only capture the blue target, for YEARS, until they moved that to DX too. It worked great to stop video pirating, though, until they fixed it. The same blue as the BSOD color. :P

    Voodoo III-3000 Loved that card!
    Attached Files
    Last edited by ISAWHIM; Mar 23rd, 2023 at 11:48 PM.

  13. #13
    PowerPoster Elroy's Avatar
    Join Date
    Jun 2014
    Location
    Near Nashville TN
    Posts
    9,853

    Re: Can VB6 app benefit from GPU?

    To my way of thinking, the power of the GPU lies in its ability to build a library of mesh objects and UV textures, and then to allow for the building of two buffers (back buffer & front buffer). Also, it wonderfully handles various lighting effects and camera rotation and translation effects, using its mesh & texture libraries, and specified lighting effects, to build the back buffer.

    I suppose there's more to it than that (or there wouldn't be a Direct2D library), but that's always been my thinking.

    Also, maybe most of that is actually DirectX (or OpenGL), rather than the actual GPU hardware. But the GPU hardware is certainly used to get the vast majority of that done.

    For me, it's difficult to say which libraries are going to go through the work of figuring out what kind of GPU resources are available. Certainly, DirectX, OpenGL, and I'm sure GDI+ do. But I wouldn't have thought the standard GDI does (but I don't really know). It can't be trivial to figure out what level of GPU support is available, and then redirect calls to make use of that.

    Regarding the OP's question, my answer would be, "it all depends on which libraries we're making use of."
    Last edited by Elroy; Mar 24th, 2023 at 11:28 AM.
    Any software I post in these forums written by me is provided "AS IS" without warranty of any kind, expressed or implied, and permission is hereby granted, free of charge and without restriction, to any person obtaining a copy. To all, peace and happiness.

  14. #14

    Thread Starter
    Addicted Member
    Join Date
    Jan 2012
    Posts
    245

    Re: Can VB6 app benefit from GPU?

    Thanks everybody for the insights!

    It turns out that I've always incorrectly assumed that the GPU was automatically invoked when doing "graphical things".

    My "graphical things" are not that complex, and break down into:

    1. Load jpg/png images in memory in a collection using GdipLoadImageFromStream
    2. Draw these images in different sizes on a PictureBox
    3. Draw rectangles with or without a color fill
    4. Draw texts on the rectangles

    After analyzing the code in depth and optimizing where possible with the current approach, I know that the most time-consuming activities are 2 and 4. Step 1 can take some time, as there are situations where the user loads 1000+ images with a single click, but in principle it is a one-time action.

    Re #2, the images are loaded in memory as-is, and then are drawn on the PictureBox using GdipDrawImageRectI. Depending on the original size and the on-screen zoom-factor, the image has to be stretched or reduced in size. (Most of the time reduced, as users often don’t think about the consequences of image size.)

    Re #4, part of drawing the texts is calculating the font size that makes as much text as possible fit in the rectangle. This is done using GDI+ functions like GdipCreateFontFromLogfontW and GdipMeasureString: starting with an initial font size, comparing the required space with the available space, and then reducing the font size step by step until the best size, or the minimum size, has been reached.
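    The step-down fitting loop described above can be sketched like this (Python used purely to illustrate the control flow; `measure` is a hypothetical stand-in for a GdipMeasureString-based helper, and the start/minimum sizes are made-up values, not the OP's actual parameters):

```python
def fit_font_size(measure, text, box_w, box_h, start=24, minimum=6, step=1):
    """Reduce the font size until the measured text fits the box,
    or the minimum size is reached (mirrors the GdipMeasureString loop)."""
    size = start
    while size > minimum:
        w, h = measure(text, size)
        if w <= box_w and h <= box_h:
            break  # best size found
        size -= step
    return size

# Toy measurer: pretend each character is 0.6*size wide and size tall.
def measure(text, size):
    return (len(text) * size * 0.6, size)

print(fit_font_size(measure, "Hi", 100, 30))            # fits at the start size, 24
print(fit_font_size(measure, "Hello, world", 50, 20))   # steps down to the minimum, 6
```

    One possible speed-up for repeated redraws is caching the fitted size per (text, box) pair so the measurement loop only runs when the text or rectangle actually changes; a binary search over sizes instead of a linear step-down would also cut the number of GdipMeasureString calls.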

    Optimizing the code has already given me significant reductions in the time needed to redraw, 55% to 97%, but since some actions are repeated many times, I was hoping to find more ways to speed things up, e.g. using the GPU.

  15. #15
    The Idiot
    Join Date
    Dec 2014
    Posts
    2,721

    Re: Can VB6 app benefit from GPU?

    If you want to work in GDI, you should use GDI32 (AlphaBlend/BitBlt/StretchBlt), and for that you create memory DCs, not bitmaps (which are created by GDI+).
    GDI32 is much, much faster, but it's also more primitive.
    For text, I recommend bitmap fonts; that will increase speed a lot.
    And use double buffering: you have a memory DC that you draw everything to, and when done, you draw that to a picturebox.
    Everything can be found on this site. Google/search.
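    The double-buffering pattern recommended here, in language-neutral form (Python lists standing in for a memory DC and the visible PictureBox; this is an illustration of the idea, not GDI code):

```python
WIDTH, HEIGHT = 8, 4

def new_surface(fill=0):
    """A blank drawing surface: a 2D grid of 'pixels' (color indices)."""
    return [[fill] * WIDTH for _ in range(HEIGHT)]

def draw_rect(surface, x, y, w, h, color):
    """Fill a rectangle on the surface (stand-in for GDI drawing calls)."""
    for row in range(y, y + h):
        for col in range(x, x + w):
            surface[row][col] = color

# Compose the whole frame off-screen first (the "memory DC")...
back = new_surface()
draw_rect(back, 1, 1, 3, 2, 7)
draw_rect(back, 5, 0, 2, 4, 3)

# ...then copy the finished frame to the visible surface in ONE operation,
# like a single BitBlt to the PictureBox. The user never sees a half-drawn
# frame, and the expensive drawing never touches the screen directly.
screen = [row[:] for row in back]
print(screen[0])
```

    The win is that all the intermediate drawing happens against memory, and the screen is updated exactly once per frame, which is also what makes the final blit cheap for GDI to accelerate.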

  16. #16
    Angel of Code Niya's Avatar
    Join Date
    Nov 2011
    Posts
    8,598

    Re: Can VB6 app benefit from GPU?

    Quote Originally Posted by Elroy View Post
    To my way of thinking, the power of the GPU lies in its ability to build a library of mesh objects and UV textures, and then to allow for the building of two buffers (back buffer & front buffer). Also, it wonderfully handles various lighting effects and camera rotation and translation effects, using its mesh & texture libraries, and specified lighting effects, to build the back buffer.
    You're giving GPUs way too much credit. A GPU does one thing and one thing only: it crunches numbers. That's all it does. The thing that makes them so "magical" is that they can perform thousands of these number crunches simultaneously. It also just so happens that a lot of graphical operations are highly parallelizable. In fact, this is why GPUs are designed the way they are: to facilitate the processing of graphical data. Of course, later on people started using GPUs to perform all kinds of parallel calculations that have nothing to do with graphics.

    When you're processing textures and meshes, your GPU really doesn't know what these things are. These abstractions only exist in libraries like OpenGL and DirectX. When it gets to the GPU, all it sees are computation algorithms and input data in its memory to run these computations on. OpenGL, DirectX, and so on, are what give the data some meaning.
