Continuing question about my MS Access to SQL Server conversion project:
0) Many thanks for previous assistance - we now have the batch process
running successfully on the client's network.
1) It's an MS Access batch process using approximately 150 tables and 300
queries - it sounds like a mess, but it is a reasonably disciplined and
structured application, dealing with real-world, very noisy data from
3 sources and massaging it into a unified set of data recorded
over a 25-year period.
2) The client does not want to pay for a full conversion to SQL Server - i.e.
converting all of the MS Access code to stored procedures - they just want the
back end moved across to SQL Server for use with other tools (such as Cognos).
3) The batch process takes 12 hours on my little development
environment and 24 hours plus on their corporate network
(the original pure MS Access batch process takes 45 minutes on my
network and 6 hours on theirs).
4) The client is now starting to understand the need to move some of
the processing into SQL Server. For example, I have experimented
and found that a delete query on an intermediate work table takes
10 minutes via the Access front end, but 5 seconds as a pass-through
query. Also, some queries are taking 100 minutes to run across the
network, and if I focus attention on turning these into pass-through
queries I am sure I can drastically speed them up.
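For what it's worth, the SQL behind the pass-through version is nothing
exotic - just plain T-SQL along these lines (table and column names changed
for the example):

   -- set-based delete, runs entirely on the server instead of
   -- row by row over the ODBC link
   DELETE FROM dbo.WorkTable
   WHERE  BatchRunID = 12345

   -- and I gather that where the whole work table is being emptied,
   -- TRUNCATE TABLE is faster still because it is minimally logged
   TRUNCATE TABLE dbo.WorkTable

On the Access side it is just a saved query with the ODBC connect string
filled in and the Returns Records property set to No.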
QUESTION
Before I put the client to the expense of additional development, are
there any other steps I should follow first - i.e. are there any
settings I should check on the SQL Server?
For example, I don't need transaction logging at all - it's really a
single-user, batch application - if it crashes halfway through we can
just restart it, and it is built in such a way that it will sort itself
out.
I have to confess that I only know enough about SQL Server to be
dangerous - so if anyone could point me at some topics, I will go off
and do some research - but at the moment I don't know where to start.
Many thanks
Tony
You can't turn off logging completely, but under the right conditions you
can do a "minimally logged load". Look in Books Online under that topic for
more details. But this will only help with loading tables, not with
manipulating the data once it is there. Are you using SET NOCOUNT ON in all
your batches? I have no clue what they are really doing, but if you are
issuing deletes etc. via a GUI when you could simply pass the query through
to the server, you are definitely going to slow things down. I suspect there
are lots of things you can do to speed things up, but without knowing more
about exactly what you are doing and how, it's pretty hard to say.
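Just as a rough illustration (database, table and column names made up, not
from your app), a minimally logged load plus NOCOUNT looks something like
this:

   SET NOCOUNT ON   -- stops the "n rows affected" chatter after every statement

   -- SELECT ... INTO a brand new table is minimally logged when the database
   -- is in SIMPLE or BULK_LOGGED recovery, so it is a cheap way to rebuild a
   -- work table entirely on the server
   SELECT src.*
   INTO   dbo.WorkTable_Stage
   FROM   dbo.SourceTable AS src
   WHERE  src.LoadDate >= '19800101'

The same goes for BULK INSERT from a file under the right recovery model and
options. But again, that only covers getting data into tables, not what you
do with it afterwards.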
Andrew J. Kelly SQL MVP
<ace join_to ware@.iinet.net.au (Tony Epton)> wrote in message
news:41cb69e9.7622515@.news.m.iinet.net.au...
> Continuing question about my MS Access to SQL Server conversion project:
> 0) Many thanks for previous assistance - we now have the batch process
> running successfully on the client's network.
>
Good to hear.
> 1) It's an MS Access batch process using approximately 150 tables and 300
> queries - it sounds like a mess, but it is a reasonably disciplined and
> structured application, dealing with real-world, very noisy data from
> 3 sources and massaging it into a unified set of data recorded
> over a 25-year period.
>
Yeah... there's "ideal" and "reality" :-)
> 2) The client does not want to pay for a full conversion to SQL Server - i.e.
> converting all of the MS Access code to stored procedures - they just want the
> back end moved across to SQL Server for use with other tools (such as Cognos).
> 3) The batch process takes 12 hours on my little development
> environment and 24 hours plus on their corporate network
> (the original pure MS Access batch process takes 45 minutes on my
> network and 6 hours on theirs).
Any idea why so much longer? More data or what?
> 4) The client is now starting to understand the need to move some of
> the processing into SQL Server. For example, I have experimented
> and found that a delete query on an intermediate work table takes
> 10 minutes via the Access front end, but 5 seconds as a pass-through
> query. Also, some queries are taking 100 minutes to run across the
> network, and if I focus attention on turning these into pass-through
> queries I am sure I can drastically speed them up.
Yes. Generally in cases like this, as much of the work as possible should be
done on the server. As you note, the speed improvements can be dramatic.
Ultimately this will probably sell them on moving more into SQL Server. Also,
I'll bet their network admins will notice the lower network load as more of
the processing moves to the database, and will thank you for it.
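As a sketch of the kind of thing that pays off (table and column names
invented for illustration): an Access update query joining two linked SQL
Server tables is processed by Jet and can end up dragging rows across the
network, while the same statement sent as a pass-through query runs entirely
on the server:

   -- illustration only - names are made up
   UPDATE w
   SET    w.CustomerKey = c.CustomerKey
   FROM   dbo.WorkTable AS w
   JOIN   dbo.CustomerXref AS c
          ON c.LegacyCode = w.LegacyCode
   WHERE  w.CustomerKey IS NULL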
> QUESTION
> Before I put the client to the expense of additional development, are
> there any other steps I should follow first - i.e. are there any
> settings I should check on the SQL Server?
"Maybe". There are some best practices, such as splitting log traffic to a
separate RAID 1 or RAID 10.
But, generally I'd look at code first. If you're already going from 600
seconds to 5 seconds, tweaking the server most likely won't get you another
120x improvement.
> For example, I don't need transaction logging at all - it's really a
> single-user, batch application - if it crashes halfway through we can
> just restart it, and it is built in such a way that it will sort itself
> out.
Switching the database to the SIMPLE recovery model will help.
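Something along these lines, run once against the database (the database name
here is just an assumption for the example):

   ALTER DATABASE BatchWorkDb SET RECOVERY SIMPLE

   -- SIMPLE lets log space be reused after each checkpoint and allows
   -- SELECT INTO / bulk loads to be minimally logged; it does not mean
   -- "no logging at all"

It does rule out point-in-time log restores, but that sounds fine for a
restartable single-user batch like yours.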
> I have to confess that I only know enough about SQL Server to be
> dangerous - so if anyone could point me at some topics, I will go off
> and do some research - but at the moment I don't know where to start.
>
Sounds like you're off to a good start already.
> Many thanks
> Tony