Friday, March 30, 2012

Performance tuning issues

I have built a solution which runs for two hours on a server with four 2 GHz CPUs and 2 GB of RAM on Windows 2000 Server (CPU utilization almost 70% and almost out of RAM). I moved the two source databases and the solution to a new box running eight 3 GHz Xeons and 16 GB of RAM on Windows 2003 Server 32-bit, and it still runs for two hours (CPU utilization 10% and ample RAM left).

I was expecting it to run much faster, so I started exploring the performance tuning features in SSIS and began tweaking the following:

Control Flow:

DefaultBufferMaxRows: Calculated the maximum rows based on row size and buffer size.

DefaultBufferSize: Set this to the maximum of 100 MB.

DataFlow Destination:

Rows Per Batch: Set this equal to the number of rows expected from the source.

Maximum Insert Commit Size: Set this to zero, since memory was not an issue.
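For what it's worth, here is roughly how I calculated the max rows; the 500-byte row width is just an illustrative figure, not my actual row size:

```sql
-- Rough sizing sketch: DefaultBufferMaxRows is approximately
-- DefaultBufferSize divided by the estimated row width.
-- 100 MB buffer / 500-byte rows (illustrative figures):
SELECT (100 * 1024 * 1024) / 500 AS MaxRowsPerBuffer;  -- 209715 with integer division
```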

I took the recommendations from other threads on similar issues here, including the excellent recommendations at http://www.microsoft.com/technet/prodtechnol/sql/2005/ssisperf.mspx, but now the job runs for six hours.

Can anyone explain what I am doing wrong here? I have tried each of the above one by one and all together. No matter what combination I try, it runs no faster, and both the source and destination databases are on the same server. Even SELECTs from the same database have slowed from ten minutes to an hour.

Any assistance is appreciated; I need to get this job to run in an hour.

Thanks!

- Philips.

How complex is your solution? I would recommend that you look at a lower grain; take a look at the execution log and compare it against previous logs from the old server to see if you can identify a specific part of the process as the bottleneck...|||

I'd also recommend watching the OVAL webcast that I talk about here:

Donald Farmer's Technet webcast
(http://blogs.conchango.com/jamiethomson/archive/2006/06/14/4076.aspx)

-Jamie

|||Did you set up the server to access more than 4 GB of memory?

You need to configure Windows and then SQL Server.|||Yes, SQL Server is using around 14 GB of memory and AWE is turned on. Thanks!|||
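For reference, enabling AWE on the SQL Server side looks roughly like the sketch below; the Windows side also needs the /PAE boot switch and the Lock Pages in Memory right for the service account:

```sql
-- Sketch of enabling AWE on 32-bit SQL Server 2005.
-- 'awe enabled' is an advanced option, so it must be exposed first.
sp_configure 'show advanced options', 1;
RECONFIGURE;
GO
sp_configure 'awe enabled', 1;
RECONFIGURE;
GO
```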

It is a financial warehouse job which collects data from an ERP system, loads a staging area, and then loads the datamart. It also creates aggregate tables. There are around eight packages called from the master package.

My problem is that I cannot find any way of using those four performance tuning settings accurately.

Any change I make to the default settings slows the job down.

|||

Thanks! I went through it.

I am at a point where I am willing to drop all the control flow tasks and use plain old INSERT INTO statements, using the power of the DBMS engine rather than trying to get the SSIS engine to do it, unless I figure out what is wrong and why.

|||

Philips-HCR wrote:

Thanks! I went through it.

I am at a point where I am willing to drop all the control flow tasks and use plain old INSERT INTO statements, using the power of the DBMS engine rather than trying to get the SSIS engine to do it, unless I figure out what is wrong and why.

There's nothing wrong with doing that. The use of SSIS does not dictate that you should use data flows to move your data about. If SQL is an option then invariably it will be the best option; it depends on your requirements and your preferences. You can issue SQL from an Execute SQL Task and still leverage all the other good stuff in SSIS like logging, workflow, portability, etc., if you so wish.
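For example, a set-based load issued from an Execute SQL Task might look like the sketch below; the table and column names are hypothetical, not from this thread:

```sql
-- Hypothetical set-based load from staging into the datamart,
-- run as a single statement from an Execute SQL Task
-- instead of a row-by-row data flow.
INSERT INTO dm.FactSales (OrderDate, CustomerKey, Amount)
SELECT s.OrderDate, s.CustomerKey, s.Amount
FROM stg.Sales AS s;
```

Because the statement runs entirely inside the database engine, it avoids moving every row through the SSIS pipeline buffers.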

-Jamie
