I'm porting an application to a web-based application, so I have to use ADO.NET to access a SQL Server DB. But the performance is extremely poor!!!!
Just to compare: with the normal application or with SQL Query Analyzer, the query needs about 2 seconds. With ADO.NET, it's about 200 seconds.....!!!!
Of course, the query is quite long. But the difference is extreme... too extreme for the same query...
I've already tried a lot... actually everything I found on the web. What is really strange is that neither the processor nor the memory is fully used... And the web server uses more CPU than the SQL Server itself (both servers are still on my machine, where I'm programming). So, could the problem be in filling up the DataSet with the SqlDataAdapter or something similar?
Thx for your help!!
Two words, "stored procedure". If you query is long you're passing to the DB each time expensive in so many ways. Plus make sure you're re-using the connection by passing in exactly the same details...actually just use a stored procedure.
|||Thx pkr! I had already read about that way to improve the performance and I tried it. Unfortunately, it's not the solution. Maybe it improves the performance a little bit, but it seems the problem is elsewhere... What I found out till now:
- it's not the hardware
- it's not the connection
- it's not the amount of data sent between the web server and the db server
Actually, other queries execute quite fast and without this performance leak.
So, quite possibly the problem is the query itself. But the fact that the query executes 100x faster in Query Analyzer than over ADO.NET can't be normal. There has to be something... but where?
Any idea?
Also, with other queries, the performance of ADO.NET and Query Analyzer is almost the same. So, where is this performance leak coming from?
|||Thx rtk! It's not the first time I've read about stored procedures, and I tried it. Unfortunately, it's not the solution to my problem. It gives me an improvement, but just a tiny one.
The problem has to be elsewhere...
What I found out until now:
- it's not the hardware
- it's not the connection
- it's not the amount of data sent between the web server and the db server
So, it sounds like it's the query itself. What is so strange is the performance difference between SQL Query Analyzer and ADO.NET. Query Analyzer executes the query 100x faster than ADO.NET does. And yet other queries execute fast through ADO.NET.
Here is the query:
SELECT table1.col1, table2.col1, count(table2.col2) as counter
FROM table3,table1, table2
WHERE table3.Valid = 1
AND table3.SessionId = table2.SessionId
AND table2.col4 *= table1.col2
AND table2.col3 IN ('01','02','04')
AND table3.SessionId IN (
SELECT SessionId
FROM userTable
WHERE UserName = 'userABC' )
GROUP BY table1.col1, table2.col1
Comment: I know the "*=" is not the nicest solution (it should be a right outer join), but I use it in other queries too and they execute fast.
Any idea what I can improve... or, if it's really the query, how to rewrite it?
|||If you're telling me that the very same query, as a stored procedure, is 100x slower than in QA, then the only thing I can think of is that it's the initial compile time. So in QA run DBCC FreeProcCache and DropCleanBuffers, then run the query again. This should put the two mechanisms on a level playing field. That would then suggest that the query is too complicated (although it doesn't look it). I would also move to the "inner join" syntax and remove the subquery.
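Something along these lines (a sketch only; I'm assuming the *= was meant to keep all table2 rows, and that SessionId is unique per user in userTable, otherwise keep the IN or use EXISTS instead of the join to userTable):

SELECT t1.col1, t2.col1, COUNT(t2.col2) AS counter
FROM table3 t3
INNER JOIN userTable u ON u.SessionId = t3.SessionId
                       AND u.UserName = 'userABC'      -- replaces the IN subquery
INNER JOIN table2 t2   ON t2.SessionId = t3.SessionId
LEFT OUTER JOIN table1 t1 ON t1.col2 = t2.col4         -- assumed direction of the old *=
WHERE t3.Valid = 1
  AND t2.col3 IN ('01','02','04')
GROUP BY t1.col1, t2.col1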
|||I didn't understand exactly what you meant by FreeProcCache. But isn't that a modification in QA? It can't be that.
What I also thought of is that the priority of a .NET data provider might not be as high as when the same query is processed through QA? Could that be? How would I change it?
I will try to use the inner join. Actually, I did some research on the query and there are 2 parts that are slowing down the whole thing:
1. the join... of course
2. the IN ('02', ...)
But the point is that something has to be different to cause this huge performance difference between QA and ADO.NET with the same query. So, how exactly does ADO.NET handle the query? Where are the differences between the 2 methods? ... There has to be something, and it is really important for me to find it.
Thanks for your help, anyway!!|||
First, create the stored procedure. Then call it (the stored procedure) from both QA and ADO.NET and see what the time difference is.
I'd recommend updating your SQL Syntax as well, but that shouldn't make a difference between the two.
To your question: yes, there are certain differences between the default connection settings in QA and ADO.NET. One or more of these settings may be the culprit.
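A quick way to compare (just a check, not a fix): run the statement below once from QA and once over your ADO.NET connection, e.g. from a small test page, and compare the output, since differing SET options can lead to different plans being compiled:

-- Lists the SET options active for the current connection
-- (ANSI_NULLS, ARITHABORT, QUOTED_IDENTIFIER, ...)
DBCC USEROPTIONS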
|||Given a level playing field, there are only a few minor differences between QA and a SqlClient call. The typical pitfall of using QA has to do with the way SQL caches its query plans, and to some extent its data. I'll exaggerate to make the point. Suppose you've developed a query that is complicated for SQL to compile a plan for, and remember that a plan can be cached per user. In QA it takes 60 secs to compile and cache the plan. So the next time you run the query from QA the cache is ready, and although the query is complex, it only takes 1 second to run. Eureka, you shout, and you make the call from your ADO client. However, you've decided to use a different connection string per user, e.g. via NT authentication. So each time a new user runs the query for the first time you incur the 60 sec overhead.
The "free proc cache" et al, remove all the caches of procedures and data so next time you run the query in QA you can see if any of the above is the problem. They are command you can issue from QA, look up "DBCC" in the help for information on how to run them (they're very simple)...although don't do it on a production server!