Friday, March 9, 2012

Performance on a laptop vs Server...

I work for a car dealership in MA and am in the process of developing a
database for an eCommerce store that will sell parts for vehicles. My
challenge is that I must create forty tables of 'mask values' that are used
to map vehicles to their parts. Each table uses a different pattern for its
mask values. An example of these tables using an xxxx0000xxxx0000 pattern is
illustrated below:
4 5 6 7
12 13 14 15
20 21 22 23
… … … …
274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943 (maximum value)
Only values in a master table of 2000 mask values need be written to the
above table. Therefore I wrote a stored procedure that reads from the master
table, figures where the read value is in the pattern, and decides whether
the value should be written.
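For reference, that per-value test reduces to simple arithmetic - a minimal
sketch, assuming the xxxx0000xxxx0000 pattern admits exactly the blocks shown
above (values 4-7, 12-15, 20-23, ..., i.e. the upper four of every eight);
the variable and messages are hypothetical:

DECLARE @v BIGINT
SET @v = 17171348983
-- equivalent to (@v % 8) >= 4, written with integer division to stay
-- safe with BIGINT operands on older versions of SQL Server
IF @v - (@v / 8) * 8 >= 4
    PRINT 'value matches the pattern - write it'
ELSE
    PRINT 'value falls in a zeroed block - skip it'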
I don't currently have access to a server and will ultimately host the
application on an external server. I am doing my work on a 512MB 2.8GHz
laptop that is running Windows 2000 and the desktop version of SQL Server
2000 Desktop.
My problem is that after letting the stored procedure designed to create the
above table run for 20 hours, it had only reached mask value of
17,171,348,983. Since 40 such stored procedures must be run monthly, this
type of performance does not work.
Would the stored procedure run an order of magnitude faster if run on a
server with a full version of SQL 2000? THANKS!|||It will probably be faster due to increased memory, disk access speed,
multiple processors...
But you should always design things so that they can run on a lower spec m/c.
The performance you are seeing suggests that this will be unworkable
whatever you do.
It sounds like you are doing sequential processing (do you come from an
application background?) rather than set based and are unlikely to see much
improvement on the server.
"BernardiBob" wrote:
> I work for a car dealership in MA and am in the process of developing a
> database for an eCommerce store that will sell parts for vehicles. My
> challenge is that I must create forty tables of 'mask values' that are used
> to map vehicles to their parts. Each table uses a different pattern for its
> mask values. An example of these tables using an xxxx0000xxxx0000 pattern is
> illustrated below:
>
> 4 5 6 7
> 12 13 14 15
> 20 21 22 23
> … … … …
> 274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943 (maximum value)
> Only values in a master table of 2000 mask values need be written to the
> above table. Therefore I wrote a stored procedure that reads from the master
> table, figures where the read value is in the pattern, and decides whether
> the value should be written.
> I don't currently have access to a server and will ultimately host the
> application on an external server. I am doing my work on a 512MB 2.8GHz
> laptop that is running Windows 2000 and the desktop version of SQL Server
> 2000 Desktop.
> My problem is that after letting the stored procedure designed to create the
> above table run for 20 hours, it had only reached mask value of
> 17,171,348,983. Since 40 such stored procedures must be run monthly, this
> type of performance does not work.
> Would the stored procedure run an order of magnitude faster if run on a
> server with a full version of SQL 2000? THANKS!
>|||In article <AD14BC1B-7A1B-4B6F-9501-653E70A35BA9@.microsoft.com>, BernardiBob@.discussions.microsoft.com says...
> I work for a car dealership in MA and am in the process of developing a
> database for an eCommerce store that will sell parts for vehicles. My
> challenge is that I must create forty tables of 'mask values' that are used
> to map vehicles to their parts. Each table uses a different pattern for its
> mask values. An example of these tables using an xxxx0000xxxx0000 pattern is
> illustrated below:
>
> 4 5 6 7
> 12 13 14 15
> 20 21 22 23
> … … … …
> 274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943 (maximum value)
>
> Only values in a master table of 2000 mask values need be written to the
> above table. Therefore I wrote a stored procedure that reads from the master
> table, figures where the read value is in the pattern, and decides whether
> the value should be written.
>
> I don't currently have access to a server and will ultimately host the
> application on an external server. I am doing my work on a 512MB 2.8GHz
> laptop that is running Windows 2000 and the desktop version of SQL Server
> 2000 Desktop.
>
> My problem is that after letting the stored procedure designed to create the
> above table run for 20 hours, it had only reached mask value of
> 17,171,348,983. Since 40 such stored procedures must be run monthly, this
> type of performance does not work.
>
> Would the stored procedure run an order of magnitude faster if run on a
> server with a full version of SQL 2000? THANKS!
Laptops normally use slower CPUs, slower drives, and generally are less powerful than a workstation - there are some exceptions, such as the P4 3.2GHz Hyper-Threaded Toshiba units with 7200RPM drives....
Where you're getting slowed down, other than your approach, is in the hardware:
- 1 slow drive vs 5 fast drives in a RAID 5 setup
- transaction logs and data on a single drive vs on separate drives
- 1 slow CPU vs 2 fast, Hyper-Threaded Xeon CPUs
- a desktop OS vs a server OS that can better utilize Xeons / HT
- base memory vs LOTS of RAM in a server
The above items separate a laptop/workstation from a server class computer.
Your coding / approach may also be a determining factor in performance, but the hardware WILL make a difference.
-- -- spam999free@.rrohio.com
remove 999 in order to email me|||>I work for a car dealership in MA and am in the process of developing a
> database for an eCommerce store that will sell parts for vehicles. My
> challenge is that I must create forty tables of 'mask values' that are
> used
> to map vehicles to their parts. Each table uses a different pattern for
> its
> mask values. An example of these tables using an xxxx0000xxxx0000 pattern
> is
> illustrated below:
>
> 4 5 6 7
> 12 13 14 15
> 20 21 22 23
> . . . .
> 274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943(maximum
> value)
>
This seems a very bizarre and inefficient design. Why can't you map vehicles
to their parts with a joining table using atomic columns? Like:
CREATE TABLE ModelParts (
    model_no VARCHAR(10) NOT NULL REFERENCES Models (model_no),
    part_no VARCHAR(10) NOT NULL REFERENCES Parts (part_no),
    PRIMARY KEY (model_no, part_no)
)
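With that shape, finding every part for a vehicle is a plain indexed join -
for example (the description column and the sample model number are assumed,
not taken from your schema):

SELECT p.part_no, p.description
FROM ModelParts AS mp
JOIN Parts AS p
  ON p.part_no = mp.part_no
WHERE mp.model_no = 'MODEL123'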
--
David Portas
SQL Server MVP
--|||Hi
You may want to check your HDD fragmentation as this may significantly
affect SQL Server performance.
I would also concur with David's concerns regarding design.
John
"BernardiBob" <BernardiBob@.discussions.microsoft.com> wrote in message
news:AD14BC1B-7A1B-4B6F-9501-653E70A35BA9@.microsoft.com...
>I work for a car dealership in MA and am in the process of developing a
> database for an eCommerce store that will sell parts for vehicles. My
> challenge is that I must create forty tables of 'mask values' that are
> used
> to map vehicles to their parts. Each table uses a different pattern for
> its
> mask values. An example of these tables using an xxxx0000xxxx0000 pattern
> is
> illustrated below:
>
> 4 5 6 7
> 12 13 14 15
> 20 21 22 23
> . . . .
> 274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943(maximum
> value)
> Only values in a master table of 2000 mask values need be written to the
> above table. Therefore I wrote a stored procedure that reads from the
> master
> table, figures where the read value is in the pattern, and decides whether
> the value should be written.
> I don't currently have access to a server and will ultimately host the
> application on an external server. I am doing my work on a 512MB 2.8GHz
> laptop that is running Windows 2000 and the desktop version of SQL Server
> 2000 Desktop.
> My problem is that after letting the stored procedure designed to create
> the
> above table run for 20 hours, it had only reached mask value of
> 17,171,348,983. Since 40 such stored procedures must be run monthly, this
> type of performance does not work.
> Would the stored procedure run an order of magnitude faster if run on a
> server with a full version of SQL 2000? THANKS!
>|||Thank you all for your feedback. I am not familiar with set processing and
will look into it. If you have a reference it would be greatly appreciated. I
did defrag my hard drive before beginning the process. I concur with the
comments on the design. Unfortunately the design is not mine but that of the
vehicle manufacturer.
Regards,
-bob|||> Unfortunately the design is not mine but that of the
> vehicle manufacturer.
That seems at odds with your original statement that you are developing a
database and you intend to create the tables. Presumably what you mean here
is that the manufacturer requires that the data be *presented* to them in a
particular format. That doesn't mean it should be *stored* in that format.
It probably won't make sense to store it in the format you described if
performance is a requirement.
Set-based processing just means declarative SQL code that operates on whole
sets of rows rather than one row at a time. Basically the SELECT, UPDATE,
INSERT, DELETE statements. Your narrative suggested that you were using a
loop that operated row by row - almost always a much less efficient way to
process data.
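To make the contrast concrete, here is a minimal sketch of both styles
against hypothetical MasterMask and PatternTable tables (all names are
assumed, and the pattern test is a stand-in for whatever each of your forty
real patterns requires):

-- Row-by-row: one fetch, one test, one insert per master value
DECLARE @v BIGINT
DECLARE mask_cur CURSOR FOR SELECT mask_value FROM MasterMask
OPEN mask_cur
FETCH NEXT FROM mask_cur INTO @v
WHILE @@FETCH_STATUS = 0
BEGIN
    IF @v - (@v / 8) * 8 >= 4          -- stand-in pattern test
        INSERT INTO PatternTable (mask_value) VALUES (@v)
    FETCH NEXT FROM mask_cur INTO @v
END
CLOSE mask_cur
DEALLOCATE mask_cur

-- Set-based: the same work in one statement, which the optimizer
-- can satisfy with a single scan of the master table
INSERT INTO PatternTable (mask_value)
SELECT mask_value
FROM MasterMask
WHERE mask_value - (mask_value / 8) * 8 >= 4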
--
David Portas
SQL Server MVP
--|||On Sun, 22 May 2005 14:13:13 -0700, BernardiBob wrote:
>Thank you all for your feedback. I am not familiar with set processing and
>will look into it. If you have a reference it would be greatly appreciated. I
>did defrag my hard drive before beginning the process. I concur with the
>comments on the design. Unfortunately the design is not mine but that of the
>vehicle manufacturer.
>Regards,
>-bob
Hi Bob,
I think that the only way to improve the speed of your process
sufficiently is a rewrite of the stored procedure. Using better hardware
can speed up your process, but only linearly - I think you need an
exponential speed increase here to make it runnable.
Please check out the information on www.aspfaq.com/5006. It describes
what information we need (and how you can assemble it) to help us help
you: table structure, sample data and expected output. In this case,
posting the code of your current stored procedure would probably help as
well.
BTW, from your original post, I got the impression that you are actually
storing all rows for all theoretically possible values from 0 up to the
maximum value (274 billion and some). If that impression is correct,
then I think that you'll need a redesign as well.
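For scale, if that impression is right: 274,877,906,944 candidate values at
even ten bytes of storage apiece comes to roughly 2.7 terabytes before any
indexes - a back-of-envelope figure, but enough to show that no hardware
upgrade rescues full materialization.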
Best, Hugo
--
(Remove _NO_ and _SPAM_ to get my e-mail address)|||You suggestion was very helpful. By changing the bulk of the logic from the
stroed procedure to a dataset based .net module I was able to improve the
performance by an order of magniture.
I appreciate the suggestions of others that the design was poor. The design
of the mapping tables was necessitated by the manufacturer's design. They
provided tables of vehicles and tables of parts along with remote keys. The
tables were provided for a software application that they also provided -
which has nothing to do with the app that I was designing.
Without getting into details of their design, additional tables were
required to map the parts remote keys to the vehicle remote keys. These
tables were omitted by the manufacturer because they were proprietary and
were generated by their 'black-box' software, which we lacked access to. Thus,
I am stuck with their design - which results in almost instant response time
for the final application.
"Nigel Rivett" wrote:
> It will probably be faster due to increased memory, disk access speed,
> multiple processors...
> But you should always design things so that they can run on a lower spec m/c.
> The performance you are seeing suggests that this will be unworkable
> whatever you do.
> It sounds like you are doing sequential processing (do you come from an
> application background?) rather than set based and are unlikely to see much
> improvement on the server.
> "BernardiBob" wrote:
> > I work for a car dealership in MA and am in the process of developing a
> > database for an eCommerce store that will sell parts for vehicles. My
> > challenge is that I must create forty tables of 'mask values' that are used
> > to map vehicles to their parts. Each table uses a different pattern for its
> > mask values. An example of these tables using an xxxx0000xxxx0000 pattern is
> > illustrated below:
> >
> >
> > 4 5 6 7
> > 12 13 14 15
> > 20 21 22 23
> > … … … …
> > 274,877,906,940 274,877,906,941 274,877,906,942 274,877,906,943 (maximum value)
> >
> > Only values in a master table of 2000 mask values need be written to the
> > above table. Therefore I wrote a stored procedure that reads from the master
> > table, figures where the read value is in the pattern, and decides whether
> > the value should be written.
> >
> > I don't currently have access to a server and will ultimately host the
> > application on an external server. I am doing my work on a 512MB 2.8GHz
> > laptop that is running Windows 2000 and the desktop version of SQL Server
> > 2000 Desktop.
> >
> > My problem is that after letting the stored procedure designed to create the
> > above table run for 20 hours, it had only reached mask value of
> > 17,171,348,983. Since 40 such stored procedures must be run monthly, this
> > type of performance does not work.
> >
> > Would the stored procedure run an order of magnitude faster if run on a
> > server with a full version of SQL 2000? THANKS!
> >

No comments:

Post a Comment