-
Mar 15th, 2017, 07:00 AM
#1
Thread Starter
New Member
RAM Sql Query VB.NET Application
I am working on a VB.NET app, where I retrieve data from a database (MS SQL Server on Windows Server 2008 R2).
I retrieve data from two tables (procedures, ~270,000 records, and proceduresAddresses, ~430,000 records).
When I retrieve the data in the app, RAM usage goes up to about 400 MB. When I retrieve the same data with MSSQL Management Studio, it doesn't show this RAM usage anywhere.
So now I am wondering whether this is normal for my application or whether I did something wrong.
I am fairly new to the database world and still don't really understand how query results are stored in an application... maybe someone can explain more about this topic.
Thanks in advance, and apologies for my bad English.
Cheers, RG
-
Mar 15th, 2017, 08:21 AM
#2
Re: RAM Sql Query VB.NET Application
Without seeing any code it is hard to say whether you are doing anything wrong...
If you are retrieving all of these results and keeping them in memory (in a DataSet, a collection, etc.) then this is going to use a lot of RAM. If you write queries that restrict the data you get back, you will limit your RAM usage.
Management Studio could either be limiting the data it retrieves at any one time (paging, a cursor, or something similar), or it may not be keeping data in memory in the same way.
If you show some code, though, you might get better advice.
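For example, here is a sketch of a query that restricts the result set instead of pulling a whole table. The connection string, table, and column names are illustrative only, not from the original post:

```vbnet
' Hypothetical sketch: pull only the rows you need instead of whole tables.
' Connection string and table/column names are made-up examples.
Imports System.Data.SqlClient

Module QueryExample
    Sub Main()
        Using conn As New SqlConnection("Server=.;Database=MyDb;Integrated Security=True")
            conn.Open()
            ' TOP plus a WHERE filter keeps the client-side result set small.
            ' (SQL Server 2008 R2 does not support OFFSET/FETCH, so TOP is used here.)
            Dim sql As String =
                "SELECT TOP (500) p.ProcedureId, p.Name " &
                "FROM procedures p WHERE p.Name LIKE @search + '%'"
            Using cmd As New SqlCommand(sql, conn)
                cmd.Parameters.AddWithValue("@search", "Appendec")
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        Console.WriteLine(reader.GetString(1))
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module
```

Reading through a SqlDataReader like this streams rows one at a time, so the app never holds all 500 rows in memory at once unless you copy them into a collection yourself.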
-
Mar 15th, 2017, 08:47 AM
#3
Re: RAM Sql Query VB.NET Application
SSMS does some memory management of its own, which you can see in action by scrolling through the results while the query runs. You'll see it stop, wait, then dump out another chunk of records... that's the server paging the data.
The other thing to note is that SQL Server itself will ramp up some memory ON the SQL Server machine... it isn't clear whether your local machine is also the SQL Server in this case, or whether it is another machine. Also, once the results come back, SSMS displays them and then dumps the data; it doesn't keep it around in memory. Your app, on the other hand, plays by some different rules:
1) My guess is that you're putting things into a DataSet or DataTables, which means the data is going to be in memory somewhere until such time as you dump it as well.
2) Even after you dump the DataTables, the memory is still allocated to the process, so usage will continue to look high, but that's only allocation, not actual use.
3) Is there really a need to pull all 270k and 430k rows? That seems like an awful lot of data, most of which is probably never used. Ideally, you should only pull the rows you need, when you need them. I work on systems that have millions of rows in them. I never pull them all. The user is only ever going to need 5-10 at any given time, each of which is loaded on demand. Even when they run a query in the system with no filter, we limit them to 500 records, for good reason.
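The "load on demand" idea above can be sketched as a paging query. On SQL Server 2008 R2 (which predates OFFSET/FETCH) this is usually done with ROW_NUMBER(); all names below are illustrative assumptions, not from the thread:

```vbnet
' Hypothetical sketch of fetching one page of rows at a time with ROW_NUMBER(),
' so the client only ever holds pageSize rows instead of the whole table.
' Connection string and table/column names are made-up examples.
Imports System.Data.SqlClient

Module PagingExample
    Sub LoadPage(pageNumber As Integer, pageSize As Integer)
        Dim sql As String =
            "SELECT ProcedureId, Name FROM " &
            " (SELECT ProcedureId, Name, " &
            "    ROW_NUMBER() OVER (ORDER BY ProcedureId) AS rn " &
            "  FROM procedures) numbered " &
            "WHERE rn BETWEEN @first AND @last"
        Using conn As New SqlConnection("Server=.;Database=MyDb;Integrated Security=True")
            conn.Open()
            Using cmd As New SqlCommand(sql, conn)
                ' Page 1 with pageSize 500 asks for rows 1..500, page 2 for 501..1000, etc.
                cmd.Parameters.AddWithValue("@first", (pageNumber - 1) * pageSize + 1)
                cmd.Parameters.AddWithValue("@last", pageNumber * pageSize)
                Using reader As SqlDataReader = cmd.ExecuteReader()
                    While reader.Read()
                        Console.WriteLine(reader.GetInt32(0) & " " & reader.GetString(1))
                    End While
                End Using
            End Using
        End Using
    End Sub
End Module
```

Whether you stream with a DataReader or fill a small DataTable per page, the point is the same: the database holds the full 270k/430k rows, and the app only materializes the handful it is actually showing.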
-tg