Wednesday, March 21, 2012

Performance problem?

I'm having performance problems in the following scenario:
I import about 3 million rows into a table with DTS (no indexes, because of
the large amount of data). I then run a script with 3 DELETE and about 25
UPDATE statements. This takes about 3 hours (is that long?). I then TRUNCATE
the table and repeat the process (with new data, of course). It takes longer
and longer on each iteration. What can I do to improve performance? Would
dropping and recreating the table, or the whole database, help? I've noticed
that the time the script saves when the table is indexed during the import is
offset, since it takes much longer to import that much data with an index in
place. Any ideas would be greatly appreciated! Thanks!
Also, recovery is set to SIMPLE!
Hi,
Why don't you:
Drop the indexes
Load the data
Create the indexes
Run the updates/deletes
Running deletes and updates against a suitably indexed table is a lot faster.
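A minimal sketch of the load-then-index pattern above. Since the original question is about SQL Server and DTS, this uses Python's built-in sqlite3 module only as a stand-in to illustrate the ordering; the table, column, and index names are hypothetical:

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Staging table starts with no indexes, so the bulk load is cheap.
cur.execute("CREATE TABLE staging (id INTEGER, status TEXT, amount REAL)")

# 1. Load the data first (nothing to maintain during the insert).
rows = [(i, "bad" if i % 10 == 0 else "new", i * 1.5) for i in range(100_000)]
cur.executemany("INSERT INTO staging VALUES (?, ?, ?)", rows)

# 2. Create the index after the load, so it is built once in one pass
#    instead of being updated row by row during the import.
cur.execute("CREATE INDEX ix_staging_status ON staging (status)")

# 3. Run the deletes/updates; their WHERE clauses can now use the index.
cur.execute("DELETE FROM staging WHERE status = 'bad'")
cur.execute("UPDATE staging SET status = 'done' WHERE status = 'new'")
conn.commit()
```

The same ordering applies each iteration: after the TRUNCATE, drop the indexes again before the next DTS load, then recreate them before the update script runs.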
"mikeb" wrote:

> I'm having performance problems in the following scenario:
> I import about 3 million rows into a table with DTS(no indexes because of
> the large amount of data). I then run a script with 3 delete and about 25
> update stmnts. This takes about 3 hours( Is that long?), I then Truncate the
> table and recreate this process(with new data ofcourse). It takes longer and
> longer on each iteration. What can i do to improve the performance? Would
> dropping and recreating the table or the actual DB help? I've noticed that
> the time I save running this script with an index during the import is offset
> , since it takes much longer to import that much data with an index. Any
> ideas would be greatly appreciated! Thanks!
> Also, recovery is set to SIMPLE!
>
