WinForm application performance issues-VBForums
Results 1 to 6 of 6

Thread: WinForm application performance issues

  1. #1

    Thread Starter
    Junior Member
    Join Date
    Jun 2017

    WinForm application performance issues

    Hi everyone,

    Writing here just to get some ideas/opinions on how to improve performance in the following scenario:

    I have a WinForms application built with .NET Framework 4.0 hosted on a dual core machine.
    The main task of this application is to check a remote FTP folder every 10 seconds for new XML files.
    Once a file is found, it is copied to local folder and then the entire XML content is saved into a single SQL Server table.
    Once saved, the file is then parsed "node by node" and data is saved into different SQL Server tables using "Linq to SQL" approach.

    The application uses two backgroundworkers. One to check for XML on the FTP server, and another one to kickoff the XML parsing and saving to DB.

    XML files are usually from 5 KB to 300 KB in size, so parsing them can take anywhere from 1 second to a few minutes.

    On high traffic days, this causes a problem since a lot of files can get queued up and that affects the rest of the system, as the data from the files is used to update other parts of the system.

    Now, I did not write this application, so my first attempt was to have multiple threads work on the parsing part.

    Once the backgroundworker enters the DoWork event, it loops through a list of xml files, kicking off ThreadPool.QueueUserWorkItem(callback,data).
    The actual parsing is done in the callback function, and in there I also use a semaphore to limit the number of threads executing the code. If I set the value to 2, it seems to work as it should: I can see it start processing two files, then wait for either of them to finish before proceeding with other files (if there are any).
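    In sketch form, the throttling described above might look like the following (this is a hypothetical reconstruction, not the actual code: `ParserPool`, `ParseAndSave`, and the limit of 2 are stand-ins for the poster's names and settings):

    ```csharp
    using System;
    using System.Collections.Generic;
    using System.Threading;

    class ParserPool
    {
        // Allow at most 2 files to be parsed at once (matches the setting described above).
        private static readonly SemaphoreSlim Throttle = new SemaphoreSlim(2, 2);

        // Queues every file to the ThreadPool, but the semaphore caps concurrency.
        // Returns the number of files processed once the whole batch is done.
        public static int ProcessBatch(IList<string> xmlFiles)
        {
            int processed = 0;
            using (var done = new CountdownEvent(xmlFiles.Count))
            {
                foreach (string path in xmlFiles)
                {
                    ThreadPool.QueueUserWorkItem(state =>
                    {
                        Throttle.Wait();            // block until a parsing slot is free
                        try
                        {
                            ParseAndSave((string)state);
                            Interlocked.Increment(ref processed);
                        }
                        finally
                        {
                            Throttle.Release();     // free the slot even if parsing throws
                            done.Signal();
                        }
                    }, path);
                }
                done.Wait();                        // DoWork returns only when every file is handled
            }
            return processed;
        }

        // Placeholder for the real parse + Linq to SQL save.
        private static void ParseAndSave(string path)
        {
            Console.WriteLine("parsed " + path);
        }
    }
    ```

    Both `SemaphoreSlim` and `CountdownEvent` are available in .NET Framework 4.0, so no upgrade is needed for this pattern.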

    Now, my question here is: is there anything else I could do to improve performance? Maybe upgrading to .NET 4.5 and using async in the actual DB communication?

    Thank you for any feedback.

  2. #2
    PowerPoster techgnome's Avatar
    Join Date
    May 2002

    Re: WinForm application performance issues

    If you've got the XML in a table in SQL Server, why not make SQL Server do the work? Why pull it back out and have VB do the work when there are native SQL Server methods for working with XML that can do it for you? It can then all be done right in stored procedures.
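    As a rough sketch of this idea (everything here is invented for illustration: the `XmlInbox` staging table, the `Categories` target table, and the `/Feed/Category` document shape), the server-side shredding could be a single T-SQL statement using SQL Server's `nodes()`/`value()` XML methods, issued from the client:

    ```csharp
    using System.Data;

    static class XmlShredder
    {
        // T-SQL that shreds staged XML entirely on the server; no XML
        // travels back to the client. Table, column, and element names
        // are hypothetical.
        public const string ShredSql = @"
    INSERT INTO Categories (CategoryId, SportId, Title)
    SELECT  n.value('@id',      'int'),
            n.value('@sportId', 'int'),
            n.value('@title',   'nvarchar(200)')
    FROM    XmlInbox
    CROSS APPLY Payload.nodes('/Feed/Category') AS T(n)
    WHERE   XmlInbox.Id = @inboxId;";

        // Runs the shred for one staged file; written against the generic
        // ADO.NET interfaces so any provider's connection can be passed in.
        public static int Shred(IDbConnection conn, int inboxId)
        {
            using (IDbCommand cmd = conn.CreateCommand())
            {
                cmd.CommandText = ShredSql;
                IDbDataParameter p = cmd.CreateParameter();
                p.ParameterName = "@inboxId";
                p.Value = inboxId;
                cmd.Parameters.Add(p);
                return cmd.ExecuteNonQuery();   // all parsing happens server-side
            }
        }
    }
    ```

    The same statement could of course live inside a stored procedure instead, with the client only passing the staged row's id.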


  3. #3
    Join Date
    Feb 2006

    Re: WinForm application performance issues

    Quote Originally Posted by BlackRiver1987 View Post
    Once saved, the file is then parsed "node by node" and data is saved into different SQL Server tables using "Linq to SQL" approach.
    A lot of people like to talk about "parsing the XML" when what they are actually doing is "walking the DOM." Just as with Linq to SQL, the XML DOM is a slow and bloated technology most useful in scripts where the size and volume of data is low. Add something like XPath queries and you have a perfect storm of poor performance.

    If you are just extracting data from nodes in the XML documents, and that can be done in a single forward pass over the document, then a SAX parser can offer improved performance. The working-set (memory) penalty is far lower and the speed can be 4 to 5 times as fast. And then there is all of the overhead of .NET's interpreting, JITting, and huffing and puffing garbage collection.

    I'm not sure they ever added SAX2 to System.Xml but I think there are some examples of SAX-like parsing making use of XmlReader. Some searches may turn something up.

    As for doing the work at the database server, that sounds like an even bigger potential problem. Normally you'd do enough data reduction at the client to significantly reduce round-tripping overhead, compared to shipping bulk XML over to the server for processing. And then you have moved a large burden onto the database server, which may impact other applications that share the machine. I'm not sure what SQL Server offers for this kind of custom XML processing, but if it just means running the same weaky-squeaky .NET over there you haven't gained much.

    Sounds like a job for unmanaged native-code C++ or VB6 using the MSXML implementation of SAX2. Of course you might be doing complex enough extraction that SAX techniques might be impractical, but even resorting to a DOM and even XPath should perform better.

    System.Xml had some nasty overhead, though it may have been addressed since I last worked with it. The DOM was designed for scripting scenarios such as ASP.Net so XPath queries tended to perform better than trying to walk the DOM making use of getElementsByTagName() calls:

    Use of GetElementsByTagName considered harmful

    I just ran a test in VB6 using MSXML SAX to parse a 1.7 MB (on disk) sample XML document containing 2048 "rows" of data.

    Displaying the extracted rows of data using the MSXML DOM took 562 ms; using SAX took 141 ms. Doing the same thing but adding the rows to a local Jet MDB database, SAX took 266 ms. And this is an old, slow machine with a slow HDD.

    SAX performance scales almost linearly: XML half the size takes just about half the time. Not so for DOM until the documents get pretty small (like around 100 KB).

    Do I expect you to use VB6? No.

    But you might look into things like XmlReader in System.Xml. I just don't know how many good tutorials and examples exist, or how much it gets used by Joe Lob Coder.
    Last edited by dilettante; Nov 6th, 2018 at 01:26 PM.

  4. #4

    Thread Starter
    Junior Member
    Join Date
    Jun 2017

    Re: WinForm application performance issues

    Thank you both for taking the time to answer.


    I did consider this option. And probably, if everything else fails I will have to take that road.
    But at the moment I would like to avoid that, because the server and DB are already under heavy load. The DB itself was created more than 10 years ago and already has SPs dealing with XML string parsing, Service Broker firing off thousands of messages in short periods of time, and so on.
    I consider it a legacy system even though it is still the backbone of the business.


    That's a really detailed answer. I went over the code once more and I believe the following part of the code is working fairly well:
    foreach file in files
     if file not already saved in DB
      load the file into an XElement object using XElement.Load(filepath)
      send the XElement object to a DB table (db.myTable.InsertOnSubmit; db.SubmitChanges)
    Performance of the above piece of code is acceptable. Depending on the size of the file, it takes a few seconds to save it to DB.
    Next part of code takes the XML and deserializes it into myComplexObject.
    Below is the code snippet that does that.

    c# Code:
    public static MyComplexObject Deserialize(FileInfo fi)
    {
        MyComplexObject data = null;
        XmlSerializer ser = new XmlSerializer(typeof(MyComplexObject));
        try
        {
            using (FileStream fs = fi.OpenRead())
                data = (MyComplexObject)ser.Deserialize(fs);
        }
        catch
        {
            // Fallback for files that fail to deserialize directly:
            // re-read the file as UTF-7 text, re-encode it as UTF-8, and retry.
            using (TextReader tr = new StreamReader(fi.FullName, Encoding.UTF7))
            using (MemoryStream ms = new MemoryStream(Encoding.UTF8.GetBytes(tr.ReadToEnd())))
            {
                data = (MyComplexObject)ser.Deserialize(ms);
            }
        }
        data.FilePath = fi.Name;
        using (var reader = fi.OpenText())
            data.Xml = reader.ReadToEnd();   // note: this reads the file from disk a second time
        data.EnsureRelationships();
        return data;
    }

    After this function returns, I have the entire object to work with.
    This object has several other objects encapsulated and each of those can have other objects.
    Every object has a SaveChanges method that most of the time looks like this:

    c# Code:
    public void SaveChanges(dbDataContext db)
    {
        // One query (round trip) per category just to check for existence.
        var cat = db.Categories.FirstOrDefault(x => x.CategoryId == this.CategoryId);
        if (cat == null)
        {
            cat = new MyData.Data.Category()
            {
                CategoryId = this.CategoryId,
                SportId = this.SportId
            };
            db.Categories.InsertOnSubmit(cat);
        }
        cat.SportId = this.SportId;
        cat.Title = this.ToString();
        if (this.Tournaments != null)
        {
            foreach (Tournament t in this.Tournaments)
            {
                t.SaveChanges(db, cat);
            }
        }
        base.SaveChanges(db, "Category", this.categoryId);
    }

    At some point db.SubmitChanges is called and that kicks off the process of pushing data to the DB. I believe the queries used are not very optimized, and the fact that the DB is on another server just adds to the problem.
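    One concrete source of that chatter is the FirstOrDefault call in each SaveChanges: it costs a round trip per object. A sketch of one way to cut that down (using in-memory stand-ins here so it runs standalone; `CategoryRow`, `BatchSaver`, and the field names are invented, and the comments note where the real Linq to SQL calls would go) is to fetch all affected rows in one query and resolve them from a local dictionary:

    ```csharp
    using System.Collections.Generic;
    using System.Linq;

    // Stand-in for the Linq to SQL Category row from the snippet above.
    class CategoryRow
    {
        public int CategoryId;
        public int SportId;
        public string Title;
    }

    static class BatchSaver
    {
        // existingRows stands in for db.Categories. With Linq to SQL, a
        // Where(r => ids.Contains(r.CategoryId)) query would fetch every
        // needed row in ONE round trip instead of one FirstOrDefault each.
        public static List<CategoryRow> Upsert(
            List<CategoryRow> existingRows,
            IEnumerable<CategoryRow> parsedCategories)
        {
            var inserted = new List<CategoryRow>();
            var byId = existingRows.ToDictionary(r => r.CategoryId);  // local lookup, no DB hits

            foreach (var c in parsedCategories)
            {
                CategoryRow row;
                if (!byId.TryGetValue(c.CategoryId, out row))
                {
                    row = new CategoryRow { CategoryId = c.CategoryId };
                    existingRows.Add(row);      // real code: db.Categories.InsertOnSubmit(row)
                    byId[c.CategoryId] = row;
                    inserted.Add(row);
                }
                row.SportId = c.SportId;        // updates accumulate locally and are
                row.Title = c.Title;            // flushed by a single db.SubmitChanges()
            }
            return inserted;
        }
    }
    ```

    The same batching idea applies one level down to Tournaments and the other nested objects.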

    I do believe that if I can optimize the DB communication, I might see some improvement.

    Thank you once again

  5. #5
    Join Date
    Feb 2006

    Re: WinForm application performance issues

    I'd assumed you had already tried code profiling and found that processing the XML was your bottleneck. Sorry, I must have read your question incorrectly. If you have determined that network throughput between client and server is the limiting factor then very little of what I said above applies.

  6. #6
    Join Date
    Jun 2015

    Re: WinForm application performance issues

    Have you profiled the code? How long is Deserialize taking?
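    Even without a full profiler, a quick answer can come from wrapping each stage (Deserialize, SaveChanges, SubmitChanges) in a `System.Diagnostics.Stopwatch`. A minimal helper for that (the helper itself is a suggestion, not code from the thread):

    ```csharp
    using System;
    using System.Diagnostics;

    static class StageTimer
    {
        // Runs one stage of the pipeline and returns its wall-clock duration
        // in milliseconds, so the split between parsing and DB work is visible.
        public static long TimeMs(Action stage)
        {
            var sw = Stopwatch.StartNew();
            stage();
            sw.Stop();
            return sw.ElapsedMilliseconds;
        }
    }
    ```

    For example, `StageTimer.TimeMs(() => MyComplexObject.Deserialize(fi))` versus `StageTimer.TimeMs(() => data.SaveChanges(db))` would immediately show whether the XML handling or the DB round trips dominate.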
