Showing posts with label proc. Show all posts

Monday, March 26, 2012

Performance questions

We have a production database that sits on a 4-processor server with 4 GB of memory and SAN disk storage over fiber. Some stored procedures there take approximately 10 minutes to run. A developer has SQL Server installed on his local PC, which has a single 2.5 GHz processor and 2 GB of memory, and the same stored procedures run in approximately 2 minutes. I have updated statistics and rebuilt indexes to no avail. He is questioning why it runs so much faster on his smaller PC than in the production environment. I have monitored CPU, memory, and disk queue length while the stored procedures are running, and none of these performance counters look concerning to me.

Can anyone out there give me some input on what I could check to figure out why we are experiencing this performance difference?

Thanks,

Corey

Do both databases have the same volume of data? You did not mention the speed and type of the processors in the server. If it is an older server with, say, 1 GHz Pentium III-based Xeon CPUs, you could easily see much slower query performance than on a developer's workstation with a single faster CPU.

You should also look at the graphical execution plan for the query on the server, and see whether the plan is being parallelized. By the way, what version of SQL Server are you running?
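To test the parallelism theory directly, one option (sketched here with a hypothetical table name) is to run a suspect query once normally and once with parallelism disabled for that statement:

```sql
-- Hypothetical example query; OPTION (MAXDOP 1) forces a serial plan
-- for this statement only, without changing the server-wide setting.
SELECT COUNT(*)
FROM dbo.SomeLargeTable
OPTION (MAXDOP 1)
```

If the serial run on the server is suddenly close to the workstation's 2 minutes, a bad parallel plan is the likely culprit.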

|||

The data is the same between the two environments. His PC has a 2.5 GHz Intel processor and our server has 4 2.7 GHz Intel processors. We are running SQL Server 2000 SP3a. I will try to look at the queries, but there are hundreds. Shouldn't the query plan be the same, since I ran UPDATE STATISTICS, rebuilt the indexes, and then ran the process on each machine, so that the comparison was as close as possible?

|||

Just pick two or three commonly executed queries and run them in Query Analyzer with the graphical execution plan turned on and with SET STATISTICS IO on (just run that statement before you run the queries).

Do you see similar query plans and I/O statistics between the developer's workstation and your server?

Are we comparing a developer's workstation with no load against a production server with a full workload?

If so, I would start looking at the production server to see if you see signs of CPU pressure, memory pressure, I/O bottlenecks, etc.
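The comparison above can be sketched as follows; the procedure name and parameter are hypothetical placeholders:

```sql
-- Run in Query Analyzer on both machines with the graphical execution
-- plan enabled, then compare the Messages output side by side.
SET STATISTICS IO ON
SET STATISTICS TIME ON

-- Hypothetical example; substitute one of your commonly executed queries.
EXEC dbo.usp_GetOrders @CustomerID = 42

SET STATISTICS IO OFF
SET STATISTICS TIME OFF
```

Large differences in logical reads or scan counts between the two machines usually indicate different execution plans rather than a raw hardware gap.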

|||

I will try to get a couple of queries and compare them as suggested.

The developer's workstation has no load, and the production server does have other work going on, but the server has plenty of resources left and is a much larger box than the developer's workstation. I will go ahead and grab some PerfMon counters to verify this too.


Monday, March 12, 2012

Performance problem

Hi All,
I'm investigating a performance problem and I don't really know if the following stored procedure is OK or if it could be written differently. One of the parameters is a list of event types in XML, used as a filter. If the XML is empty, it means "all events", which explains the second part of the WHERE clause.
Another way is to use a join with the temporary table, but then I still need to deal with the "all events" case with a UNION or a test on the emptiness of the temporary table.
My question is: is it OK as written, even for large tables? How could it be written otherwise?
PS: the XML passed as parameter is always relatively short, containing at most 30 entries.
Here is the SP:
CREATE PROCEDURE [dbo].[anet]
    @NetID uniqueidentifier
    ,@EventTypesXML ntext
AS
--=====================
-- Prepare event types
--=====================
DECLARE @docEventTypes int
EXEC sp_xml_preparedocument @docEventTypes OUTPUT, @EventTypesXML
SET NOCOUNT ON
DECLARE @RMC_EventTypes TABLE(
    EVNT_RT_EVTY smallint NOT NULL)
INSERT INTO @RMC_EventTypes
SELECT EVNT_RT_EVTY
FROM OPENXML (@docEventTypes, 'EventTypes/EventType')
    WITH (EVNT_RT_EVTY smallint '@number')
SET NOCOUNT OFF
EXEC sp_xml_removedocument @docEventTypes
--=========================
-- Retrieve requested data
--=========================
SET ROWCOUNT 100
SELECT EVNT_RT_EVTY AS EvntType
    ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
FROM Event AS RMCEvent
INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
WHERE (EXISTS ( -- Filter on event types
        SELECT 1
        FROM @RMC_EventTypes AS RMC_EventTypes
        WHERE (RMCEvent.EVNT_RT_EVTY = RMC_EventTypes.EVNT_RT_EVTY)
    )
    OR NOT EXISTS (
        SELECT 1
        FROM @RMC_EventTypes AS RMC_EventTypes
        WHERE (RMCEvent.EVNT_RT_EVTY <> RMC_EventTypes.EVNT_RT_EVTY)
    )
)
ORDER BY EVNT_TIMESTAMP DESC
FOR XML AUTO, ELEMENTS
SET ROWCOUNT 0
GO

"Christian Staffe" <x@y.z> wrote in message
news:43ca2b25$0$29458$ba620e4c@news.skynet.be...
> Hi All,
> I'm investigating a performance problem and I don't really know if the
> following stored proc is ok or if it could be written otherwise. One of
> the parameter is a list of eventypes in XML used as filter. If the xml is
> empty, then it means "all events", explaining the reason of the second
> part of the where clause.
> Another way is to use a join with the tempory table but then I still need
> to deal with the "all events" case with a UNION or a test on the emptiness
> of the temporary table
> My question is : is it ok as it is written even in case of large tables ?
> How could it be written otherwise ?
>
For performance it is generally better to test for the "all events" case with an IF statement and run a different query. That way the "all events" case and the "some events" case can be compiled and optimized separately. When you cram both into one query you will probably get a pretty expensive plan.
David

|||

Christian Staffe (x@y.z) writes:
> I'm investigating a performance problem and I don't really know if the
> following stored proc is ok or if it could be written otherwise. One of
> the parameter is a list of eventypes in XML used as filter. If the xml
> is empty, then it means "all events", explaining the reason of the
> second part of the where clause.
> Another way is to use a join with the tempory table but then I still
> need to deal with the "all events" case with a UNION or a test on the
> emptiness of the temporary table
> My question is : is it ok as it is written even in case of large tables ?
> How could it be written otherwise ?
That NOT EXISTS bit certainly looks weird. Simpler would be:
SELECT EVNT_RT_EVTY AS EvntType
    ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
FROM Event AS RMCEvent
INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
WHERE (EXISTS ( -- Filter on event types
        SELECT 1
        FROM @RMC_EventTypes AS RMC_EventTypes
        WHERE (RMCEvent.EVNT_RT_EVTY = RMC_EventTypes.EVNT_RT_EVTY)
    )
    OR NOT EXISTS (SELECT 1 FROM @RMC_EventTypes))
ORDER BY EVNT_TIMESTAMP DESC
FOR XML AUTO, ELEMENTS
But I don't know what effects that would have on performance.
Are the event types defined in a lookup table somewhere? How many
are they? One alternative would be to fill the table variable from
the lookup table if the XML document is empty.
Best bet for performance, though, is to have separate queries for
the two cases.
Also, I don't know what would happen if you replaced SET ROWCOUNT
with a SELECT TOP.
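As a sketch, the final SELECT of the procedure with TOP instead of SET ROWCOUNT would look like this (on SQL Server 2000, TOP takes a constant, just like the literal in SET ROWCOUNT):

```sql
-- Sketch: TOP 100 replaces the surrounding SET ROWCOUNT 100 / SET ROWCOUNT 0 pair.
SELECT TOP 100 EVNT_RT_EVTY AS EvntType
    ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
FROM Event AS RMCEvent
INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
ORDER BY EVNT_TIMESTAMP DESC
FOR XML AUTO, ELEMENTS
```

Unlike SET ROWCOUNT, TOP is part of the statement the optimizer compiles, so it can sometimes produce a cheaper plan.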
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||

OK, here is the solution I came up with following the two comments that were made here. I hope it's OK (and better!)
Side question: after the result set, I also receive the following when the XML is empty, that is, when passing '<EventTypes/>':
[Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
Can anyone tell me what it means?
CREATE PROCEDURE [dbo].[anet2]
    @NetID uniqueidentifier
    ,@EventTypesXML ntext
AS
--=====================
-- Prepare event types
--=====================
DECLARE @docEventTypes int
EXEC sp_xml_preparedocument @docEventTypes OUTPUT, @EventTypesXML
SET NOCOUNT ON
DECLARE @RMC_EventTypes TABLE(
    EVNT_RT_EVTY smallint NOT NULL)
INSERT INTO @RMC_EventTypes
SELECT EVNT_RT_EVTY
FROM OPENXML (@docEventTypes, 'EventTypes/EventType')
    WITH (EVNT_RT_EVTY smallint '@number')
SET NOCOUNT OFF
EXEC sp_xml_removedocument @docEventTypes
--=========================
-- Retrieve requested data
--=========================
SET ROWCOUNT 100
IF NOT EXISTS (SELECT 1 FROM @RMC_EventTypes)
BEGIN
    SELECT RMCEvent.EVNT_RT_EVTY AS EvntType
        ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
    FROM Event AS RMCEvent
    INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
    ORDER BY EVNT_TIMESTAMP DESC
    FOR XML AUTO, ELEMENTS
END
ELSE
BEGIN
    SELECT RMCEvent.EVNT_RT_EVTY AS EvntType
        ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
    FROM Event AS RMCEvent
    INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
    INNER JOIN @RMC_EventTypes ET ON ET.EVNT_RT_EVTY = RMCEvent.EVNT_RT_EVTY
    ORDER BY EVNT_TIMESTAMP DESC
    FOR XML AUTO, ELEMENTS
END
SET ROWCOUNT 0
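For reference, hypothetical invocations of the procedure above (the GUID is a placeholder), one for the filtered case and one for the empty document that takes the "all events" branch:

```sql
-- Filtered case: only event types 1 and 7.
EXEC dbo.anet2
    @NetID = '00000000-0000-0000-0000-000000000000'
    ,@EventTypesXML = N'<EventTypes><EventType number="1"/><EventType number="7"/></EventTypes>'

-- Empty document: the IF NOT EXISTS branch returns all events.
EXEC dbo.anet2
    @NetID = '00000000-0000-0000-0000-000000000000'
    ,@EventTypesXML = N'<EventTypes/>'
```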
"Erland Sommarskog" <esquel@sommarskog.se> wrote in message
news:Xns974CA18CE3122Yazorman@127.0.0.1...
> Christian Staffe (x@.y.z) writes:
> That NOT EXISTS bit certainly looks weird. Simpler would be:
> SELECT EVNT_RT_EVTY AS EvntType
> ,ISNULL(PROD_CUSTNAME, PROD_RT_PRTY) AS ProdName
> FROM Event AS RMCEvent
> INNER JOIN Product ON ((PROD_ID = EVNT_LT_PROD) AND (PROD_ID = @NetID))
> WHERE (EXISTS ( -- Filter on event types
> SELECT 1
> FROM @RMC_EventTypes AS RMC_EventTypes
> WHERE (RMCEvent.EVNT_RT_EVTY = RMC_EventTypes.EVNT_RT_EVTY)
> )
> OR NOT EXISTS (SELECT 1 FROM @RMC_EventTypes)
> ORDER BY EVNT_TIMESTAMP DESC
> FOR XML AUTO, ELEMENTS
> But I don't know what effects that would have on performance.
> Are the event types defined in a lookup table somewhere? How many
> are they? One alternative would be to fill the table variable with
> from the lookup table if the XML document is empty.
> Best bet for performance, though, is to have separate queries for
> the two cases.
> Also, I don't know would would happen if you replaced SET ROWCOUNT
> with a SELECT TOP.
> --
> Erland Sommarskog, SQL Server MVP, esquel@.sommarskog.se
> Books Online for SQL Server 2005 at
> http://www.microsoft.com/technet/pr...oads/books.mspx
> Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

|||

Christian Staffe (x@y.z) writes:
> OK, here is the solution I came with following the two comments that were
> made here. I hope it's ok (and better !)
> Side question : after the result set, I also receive this when the xml is
> empty, that is, when passing '<EventTypes/> :
> [Microsoft][ODBC SQL Server Driver]Unknown token received from SQL Server
> Can anyone tell me what it means ?
This means that the client got something from SQL Server that did not
comply with the TDS specification. (TDS is the protocol that client
APIs use to speak with SQL Server.) Or the client API got confused by
the TDS stream.
That is, it is a bug either in SQL Server or in the ODBC SQL Server Driver.
Sometimes this message indicates that there was a crash on the SQL
Server side. Have a look at the SQL Server error log, and see if there
is a stack dump that can be correlated with this message.
Unfortunately, the only way to resolve this issue is to change the
procedure to narrow down exactly what causes it. I would first try
removing SET ROWCOUNT 100.
(OK, there is one more way: try applying the latest service pack, in
case the issue has been fixed.)
Erland Sommarskog, SQL Server MVP, esquel@sommarskog.se
Books Online for SQL Server 2005 at
http://www.microsoft.com/technet/pr...oads/books.mspx
Books Online for SQL Server 2000 at
http://www.microsoft.com/sql/prodin...ions/books.mspx

Friday, March 9, 2012

Performance of SQL Server Stored Proc slows down 300% over a few weeks

Hi
I have a problem in my solution where a stored procedure that normally takes
about 200 ms to execute will, over a period of about 1 to 2 weeks, gradually
slow down to over a second. Eventually, it'll even slow down to 3 seconds to
execute.
I have tried defragmenting the indexes associated with the tables in the
query; I've even dropped and recreated the indexes. This has a small effect,
but not a dramatic one. As an attempt to halt the degradation in performance,
I created a scheduled job to execute the DBCC INDEXDEFRAG command on a daily
basis, but it has not stopped the execution time of the sp from degrading.
I have found that restarting SQL Server will always bring the execution
time back down to around 200 ms, but that is only a temporary solution. In a
week or two, performance will have dropped to 1 second again, as
described above.
I'm a little bit stumped. Has anyone else seen this or know how to stop it?
Thanks
Adrian

|||

What Service Pack? Any other apps running on that box? Memory settings?
"Adrian Dams" <adriandams@.yahoo.com> wrote in message
news:e5QHDn9LFHA.1528@.TK2MSFTNGP09.phx.gbl...
> Hi
> I have a problem on my solution where a stored procedure that normally
> takes about 200ms to execute will, over a period of about 1 to 2 weeks,
> gradually slow down to over a second. Eventially, it'll even slow down to
> 3 seconds to execute.
> I have tried defragmenting indexes associated with tables in the query,
> I've even dropped and recreated the indexes. This has a small effect but
> not dramatic. As an attempt to halt the degradation in performance, I did
> create a scheduled job to execute the DBCC INDEXDEFRAG command on a daily
> basis but it has not stopped the execution time of the sp from degrading.
> I have found that restarting the SQL Server will always bring the
> execution time back down to around 200ms, but it is only a temporary
> solution. In a week or two, the performance will have dropped it to 1
> second again, as described above.
> I'm a little bit stumped. Has anyone else seen this or know how to stop
> it?
> Thanks
> Adrian
>

|||

SQL 2000 SP1. The machine is running quite a few other apps as well but the
machine has 1G on it. The machine doesn't seem to be starving for memory.
"Michael C#" <xyz@.yomomma.com> wrote in message
news:uvM$CB$LFHA.3500@.TK2MSFTNGP14.phx.gbl...
> What Service Pack? Any other apps running on that box? Memory settings?
> "Adrian Dams" <adriandams@.yahoo.com> wrote in message
> news:e5QHDn9LFHA.1528@.TK2MSFTNGP09.phx.gbl...
>

|||

You might want to look at installing SP 3a. There are a lot of reasons to
update, including security and some fixes that address issues such as memory
leaks.
How much memory is SQL server using? How do your memory settings look in
SQL? Are you using a fixed memory setting or allowing SQL to dynamically
allocate memory? Even though SQL doesn't appear to be starving for memory
right now, you might want to monitor memory usage of SQL and your other
apps; particularly during the slow-downs.
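For reference, the memory settings in question can be inspected and changed with sp_configure; the 384 MB cap below is purely illustrative:

```sql
-- Show the current memory configuration (values in MB).
EXEC sp_configure 'show advanced options', 1
RECONFIGURE
EXEC sp_configure 'min server memory (MB)'
EXEC sp_configure 'max server memory (MB)'

-- Illustrative only: cap SQL Server so the other apps on the box keep some RAM.
EXEC sp_configure 'max server memory (MB)', 384
RECONFIGURE
```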
"Adrian Dams" <adriandams@.yahoo.com> wrote in message
news:eFlhiO$LFHA.3420@.tk2msftngp13.phx.gbl...
> SQL 2000 SP1. The machine is running quite a few other apps as well but
> the machine has 1G on it. The machine doesn't seem to be starving for
> memory.
>
> "Michael C#" <xyz@.yomomma.com> wrote in message
> news:uvM$CB$LFHA.3500@.TK2MSFTNGP14.phx.gbl...
>

|||

In addition to the other posts, you might want to check the execution plans
between the different
execution times.
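One way to capture the plan as text at both points in time, so the two can be compared line by line (the procedure call is a placeholder):

```sql
-- Capture the estimated plan as text instead of executing the procedure.
SET SHOWPLAN_TEXT ON
GO
EXEC dbo.usp_SlowProc  -- hypothetical placeholder for the degrading procedure
GO
SET SHOWPLAN_TEXT OFF
GO
```

Save the output while the procedure is fast and again once it has slowed down; a scan replacing a seek, or a changed join order, points at a stale cached plan or stale statistics.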
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
http://www.sqlug.se/
"Adrian Dams" <adriandams@yahoo.com> wrote in message news:e5QHDn9LFHA.1528@TK2MSFTNGP09.phx.gbl...
> Hi
> I have a problem on my solution where a stored procedure that normally takes about 200ms to
> execute will, over a period of about 1 to 2 weeks, gradually slow down to over a second.
> Eventially, it'll even slow down to 3 seconds to execute.
> I have tried defragmenting indexes associated with tables in the query, I've even dropped and
> recreated the indexes. This has a small effect but not dramatic. As an attempt to halt the
> degradation in performance, I did create a scheduled job to execute the DBCC INDEXDEFRAG command
> on a daily basis but it has not stopped the execution time of the sp from degrading.
> I have found that restarting the SQL Server will always bring the execution time back down to
> around 200ms, but it is only a temporary solution. In a week or two, the performance will have
> dropped it to 1 second again, as described above.
> I'm a little bit stumped. Has anyone else seen this or know how to stop it?
> Thanks
> Adrian
>

|||

Thanks for the response.
Originally the sql server was set to dynamically allocate memory. The
machine had 500M and the sql server had used up about 220M.
I changed the memory settings so that the SQL Server would only use 150M
fixed. The odd thing is that the SQL Server didn't use 220M as it had
previously done but it did consume 180M and stabilise there - more than
150M!!
I will certainly apply SP3a and see how that helps
Thanks again
Adrian
"Michael C#" <xyz@.yomomma.com> wrote in message
news:evxNzu$LFHA.3812@.TK2MSFTNGP10.phx.gbl...
> You might want to look at installing SP 3a. There are a lot of reasons to
> update, including security and some fixes that address issues such as
> memory leaks.
> How much memory is SQL server using? How do your memory settings look in
> SQL? Are you using a fixed memory setting or allowing SQL to dynamically
> allocate memory? Even though SQL doesn't appear to be starving for memory
> right now, you might want to monitor memory usage of SQL and your other
> apps; particularly during the slow-downs.
> "Adrian Dams" <adriandams@.yahoo.com> wrote in message
> news:eFlhiO$LFHA.3420@.tk2msftngp13.phx.gbl...
>

|||

I like to stop and re-start the SQL Server service to ensure new memory
settings kick in. I don't know exactly what your configuration is or what
apps you have on that box, but I would definitely take a look at whether or
not you really need those other apps running on the same box; and if not,
move them somewhere else. SQL is resource-hungry, and the less other stuff
you have on that box the better off you'll be. Speaking of which, make sure
you have plenty of hard drive space on that box.
"Adrian Dams" <adriandams@.yahoo.com> wrote in message
news:eRIQbJKMFHA.1096@.tk2msftngp13.phx.gbl...
> Thanks for the response.
> Originally the sql server was set to dynamically allocate memory. The
> machine had 500M and the sql server had used up about 220M.
> I changed the memory settings so that the SQL Server would only use 150M
> fixed. The odd thing is that the SQL Server didn't use 220M as it had
> previously done but it did consume 180M and stabilise there - more than
> 150M!!
> I will certainly apply SP3a and see how that helps
> Thanks again
> Adrian
> "Michael C#" <xyz@.yomomma.com> wrote in message
> news:evxNzu$LFHA.3812@.TK2MSFTNGP10.phx.gbl...
>

|||

if it is the case that your stored proc performs to expectations after
a reboot, then you probably want to add a "with recompile" at the top
of the stored proc, so that it will generate a new execution plan each
time it fires. the execution plan is probably falling out of scope over
time and causing table scans. when you restart sql server, the proc
cache is cleared, so a new one is made, hence the better performance.
if you cannot alter the stored proc, then you have two alternatives:
first, you can schedule a job that runs sp_recompile "proc_name"...do
this once a day and you should be fine.
second, you can pass the "with recompile" as a parameter to your stored
proc...this is usually done when the parameter value is atypical or
the data has changed significantly.
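The three options might look like this; the procedure name is a hypothetical placeholder:

```sql
-- Option 1: put WITH RECOMPILE in the procedure definition, e.g.
--   CREATE PROCEDURE dbo.usp_SlowProc @p int
--   WITH RECOMPILE
--   AS ...

-- Option 2: a daily job step that marks the cached plan for recompilation.
EXEC sp_recompile 'dbo.usp_SlowProc'

-- Option 3: force a fresh plan for a single call with atypical parameters.
EXEC dbo.usp_SlowProc @p = 1 WITH RECOMPILE
```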
hth,
hans

|||

Thanks for all your suggestions. I will certainly try them and report back
on the progress
Adrian
"Hans Nelsen" <hnelsen@.owh.com> wrote in message
news:1111705078.992925.138370@.z14g2000cwz.googlegroups.com...
> if it is the case that your stored proc performs to expectations after
> a reboot, then you probably want to add a "with recompile" at the top
> of the stored proc, so that it will generate a new execution plan each
> time it fires. the execution plan is probably falling out of scope over
> time and causing table scans. when you restart sql server, the proc
> cache is cleared, so a new one is made, hence the better performance.
> if you cannot alter the stored proc, then you have two alternatives:
> first, you can schedule a job that runs sp_recompile "proc_name"...do
> this once a day and you should be fine.
> second, you can pass the "with recompile" as a parameter to your stored
> proc...this is usually done when the parameter value is atypical or
> the data has changed significantly.
> hth,
> hans
>

Performance of SQL Server Stored Proc slows down 300% over a few weeks

What about updating statistics, or recompiling the sp, have you tried?
AMB
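A sketch of both suggestions; the table and procedure names are hypothetical placeholders:

```sql
-- Refresh statistics on the tables the procedure touches
-- (or run sp_updatestats for the whole database).
UPDATE STATISTICS dbo.YourTable WITH FULLSCAN

-- Mark the procedure so its plan is rebuilt on the next execution.
EXEC sp_recompile 'dbo.usp_SlowProc'
```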
"Adrian Dams" wrote:
> Hi
> I have a problem on my solution where a stored procedure that normally takes
> about 200ms to execute will, over a period of about 1 to 2 weeks, gradually
> slow down to over a second. Eventially, it'll even slow down to 3 seconds to
> execute.
> I have tried defragmenting indexes associated with tables in the query, I've
> even dropped and recreated the indexes. This has a small effect but not
> dramatic. As an attempt to halt the degradation in performance, I did create
> a scheduled job to execute the DBCC INDEXDEFRAG command on a daily basis but
> it has not stopped the execution time of the sp from degrading.
> I have found that restarting the SQL Server will always bring the execution
> time back down to around 200ms, but it is only a temporary solution. In a
> week or two, the performance will have dropped it to 1 second again, as
> described above.
> I'm a little bit stumped. Has anyone else seen this or know how to stop it?
> Thanks
> Adrian
>
>|||No, I haven't tried that. Thanks for the suggestion. I'll get back to you
with the result
Adrian
"Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in message
news:36EEAAEC-4B19-4325-861A-1130FF81D4F6@.microsoft.com...
> What about updating statistics, or recompiling the sp, have you tried?
>
> AMB
> "Adrian Dams" wrote:
>> <snip>
|||SQL 2000 SP1. The machine is running quite a few other apps as well but the
machine has 1G on it. The machine doesn't seem to be starving for memory.
"Michael C#" <xyz@.yomomma.com> wrote in message
news:uvM$CB$LFHA.3500@.TK2MSFTNGP14.phx.gbl...
> What Service Pack? Any other apps running on that box? Memory settings?
> <snip>
|||Is it possible to rebuild the stored procedure via script? This problem
exists on a customer site, and the slowdown in the sp is a big problem for
them. If possible, and if the rebuild works, I could set it up as a
scheduled job. If it's not possible, it unfortunately won't work for me as a
solution.
Adrian
"Alejandro Mesa" <AlejandroMesa@.discussions.microsoft.com> wrote in message
news:36EEAAEC-4B19-4325-861A-1130FF81D4F6@.microsoft.com...
> What about updating statistics, or recompiling the sp, have you tried?
>
> AMB
> "Adrian Dams" wrote:
>> <snip>
|||You might want to look at installing SP 3a. There are a lot of reasons to
update, including security and some fixes that address issues such as memory
leaks.
How much memory is SQL server using? How do your memory settings look in
SQL? Are you using a fixed memory setting or allowing SQL to dynamically
allocate memory? Even though SQL doesn't appear to be starving for memory
right now, you might want to monitor memory usage of SQL and your other
apps; particularly during the slow-downs.
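For reference, the memory settings he mentions can be inspected and changed from Query Analyzer. This is only a sketch, and the 512 MB figure is an arbitrary example, not a recommendation for this box:

```sql
-- Show advanced options so the memory settings become visible
EXEC sp_configure 'show advanced options', 1;
RECONFIGURE;

-- Current min/max server memory (the run value is what SQL Server is using)
EXEC sp_configure 'min server memory';
EXEC sp_configure 'max server memory';

-- Cap SQL Server at 512 MB (example value only)
EXEC sp_configure 'max server memory', 512;
RECONFIGURE;
```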
"Adrian Dams" <adriandams@.yahoo.com> wrote in message
news:eFlhiO$LFHA.3420@.tk2msftngp13.phx.gbl...
> SQL 2000 SP1. The machine is running quite a few other apps as well but
> the machine has 1G on it. The machine doesn't seem to be starving for
> memory.
> <snip>
|||In addition to the other posts, you might want to check the execution plans between the different
execution times.
--
Tibor Karaszi, SQL Server MVP
http://www.karaszi.com/sqlserver/default.asp
http://www.solidqualitylearning.com/
http://www.sqlug.se/
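A low-tech way to capture the plan and I/O figures for that comparison, along the lines of the earlier SET STATISTICS IO suggestion (the proc name is a placeholder):

```sql
-- Capture the actual plan rows and the logical/physical I/O for one execution
SET STATISTICS IO ON;
SET STATISTICS PROFILE ON;

EXEC dbo.usp_MyProc;  -- placeholder for the slow procedure

SET STATISTICS PROFILE OFF;
SET STATISTICS IO OFF;
```

Save the output while the proc is still fast, then again once it has slowed down, and compare the two.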
"Adrian Dams" <adriandams@.yahoo.com> wrote in message news:e5QHDn9LFHA.1528@.TK2MSFTNGP09.phx.gbl...
> <snip>
>|||Thanks for the response.
Originally the sql server was set to dynamically allocate memory. The
machine had 500M and the sql server had used up about 220M.
I changed the memory settings so that the SQL Server would only use 150M
fixed. The odd thing is that SQL Server didn't use 220M as it had
previously done, but it did consume 180M and stabilise there: more than the
150M cap!
I will certainly apply SP3a and see how that helps
Thanks again
Adrian
"Michael C#" <xyz@.yomomma.com> wrote in message
news:evxNzu$LFHA.3812@.TK2MSFTNGP10.phx.gbl...
> You might want to look at installing SP 3a. There are a lot of reasons to
> update, including security and some fixes that address issues such as
> memory leaks.
> How much memory is SQL server using? How do your memory settings look in
> SQL? Are you using a fixed memory setting or allowing SQL to dynamically
> allocate memory? Even though SQL doesn't appear to be starving for memory
> right now, you might want to monitor memory usage of SQL and your other
> apps; particularly during the slow-downs.
> <snip>
|||I like to stop and re-start the SQL Server service to ensure new memory
settings kick in. I don't know exactly what your configuration is or what
apps you have on that box, but I would definitely take a look at whether or
not you really need those other apps running on the same box; and if not,
move them somewhere else. SQL is resource-hungry, and the less other stuff
you have on that box the better off you'll be. Speaking of which, make sure
you have plenty of hard drive space on that box.
"Adrian Dams" <adriandams@.yahoo.com> wrote in message
news:eRIQbJKMFHA.1096@.tk2msftngp13.phx.gbl...
> Thanks for the response.
> Originally the sql server was set to dynamically allocate memory. The
> machine had 500M and the sql server had used up about 220M.
> I changed the memory settings so that the SQL Server would only use 150M
> fixed. The odd thing is that the SQL Server didn't use 220M as it had
> previously done but it did consume 180M and stabilise there - more than
> 150M!!
> I will certainly apply SP3a and see how that helps
> Thanks again
> Adrian
> <snip>
|||if it is the case that your stored proc performs to expectations after
a reboot, then you probably want to add a "with recompile" at the top
of the stored proc, so that it will generate a new execution plan each
time it fires. the execution plan is probably falling out of scope over
time and causing table scans. when you restart sql server, the proc
cache is cleared, so a new one is made, hence the better performance.
if you cannot alter the stored proc, then you have two alternatives:
first, you can schedule a job that runs sp_recompile "proc_name"...do
this once a day and you should be fine.
second, you can pass the "with recompile" as a parameter to your stored
proc...this is usually done when the parameter value is atypical or
the data has changed significantly.
hth,
hans|||Thanks for all your suggestions. I will certainly try them and report back
on the progress
Adrian
"Hans Nelsen" <hnelsen@.owh.com> wrote in message
news:1111705078.992925.138370@.z14g2000cwz.googlegroups.com...
> <snip>
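Hans's WITH RECOMPILE options, sketched in T-SQL (the procedure name, parameter, and body here are placeholders, not the poster's actual proc):

```sql
-- Option 1: bake recompilation into the proc definition, so every
-- execution gets a fresh plan
CREATE PROCEDURE dbo.usp_MyProc
    @CustomerId int
WITH RECOMPILE
AS
    SELECT * FROM dbo.Orders WHERE CustomerId = @CustomerId;
GO

-- Option 2: request a one-off recompile at call time, without
-- altering the proc
EXEC dbo.usp_MyProc @CustomerId = 42 WITH RECOMPILE;

-- Option 3: mark the existing proc for recompile on next use
-- (this is the statement you would put in a daily scheduled job)
EXEC sp_recompile 'dbo.usp_MyProc';
```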

Wednesday, March 7, 2012

Performance of Hardware

Hey all,
I have 2 servers that I am considering for SQL Server 2000. One is a
single-proc Intel 2.53GHz/533FSB with 1GB of dual-channel memory. This
server is running SBS2k3 as a domain controller and Exchange. The
other is a dual-proc AMD Athlon MP box with two 1500+ chips. That server has
768MB and is running Windows 2003 Enterprise with RRAS acting as the gateway
to the internet, plus DNS and DHCP. My question is: which is the lesser evil? I
want to try out Microsoft CRM 1.2 and am trying to balance out the workload.
Any help would be much appreciated.
Michael|||Michael,
Some questions to point you down the right track:
What system has the better disc subsystem?
Check perfmon for current CPU loadings etc?
Mike John
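One way to eyeball some of those loadings from inside SQL Server itself is the sysperfinfo system table, which exposes SQL Server's own Perfmon counters (this is a sketch; for host-wide CPU figures you would still use Perfmon proper):

```sql
-- SQL Server 2000 surfaces its perfmon counters in master.dbo.sysperfinfo
SELECT object_name, counter_name, instance_name, cntr_value
FROM master.dbo.sysperfinfo
WHERE counter_name IN ('Buffer cache hit ratio',
                       'Page life expectancy',
                       'Total Server Memory (KB)');
```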
"Michael Sainz" <mike@.twofatfrogs(remove).com> wrote in message news:efCVtdA1DHA.1272@.TK2MSFTNGP12.phx.gbl...
> <snip>
|||John,
The Intel system (DC, Exchange2k3) has a single 80GB SATA Drive. The AMD
has dual 80GB drives in a hardware mirrored array. In terms of performance and
load, there are only ever 3 clients connected, so I don't think this is an
issue at all, but I just want to make sure, and that is why I'm posting.
Thanks!
"Mike John" <Mike.John@.knowledgepool.spamtrap.com> wrote in message
news:ug7sctJ1DHA.3468@.TK2MSFTNGP11.phx.gbl...
> <snip>