microsoft.public.dotnet.framework

Batch of Stored Procedures

mj2736

3/5/2008 2:28:00 AM

In our .Net 2.0 web service we need to insert/update data records into
SQL Server 2005 that have been uploaded to us in XML documents. There
could be anywhere from one to literally thousands of records in each
uploaded transaction, and there are several dozen destination tables.

Since performance is critical for this particular app, we were
thinking of combining the stored procedure calls into configurable-
sized batches using CommandType.Text and multiple concatenated EXEC
statements, so we minimize database roundtrips as opposed to calling
each SP individually in its own SqlCommand.

Is this a good approach? One concern is that SQL Server would have to
compile each and every batch since they will all be different, and the
overhead involved in this could become significant. Is this really
something to worry about, or not? Wouldn't a few batch compiles per
transaction still be better than potentially thousands of roundtrips?
Are there other problems with this approach?

Of course we are planning to stress test to see how it actually
responds, but I just wanted to get some opinions first in case someone
has done this before and can give advice/recommendations/suggestions
before we get too far into it.
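For concreteness, one such concatenated batch would look something like
this on the wire (the procedure name and parameters here are invented
purely for illustration):

```sql
-- One round trip carrying several concatenated EXEC statements.
-- dbo.up_Record_Save and its parameters are hypothetical.
EXEC dbo.up_Record_Save @RecordID = 1, @Value = N'alpha';
EXEC dbo.up_Record_Save @RecordID = 2, @Value = N'beta';
EXEC dbo.up_Record_Save @RecordID = 3, @Value = N'gamma';
```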

Thanks!
MJ

Mufaka

3/5/2008 5:31:00 AM


mj2736@yahoo.com wrote:
> In our .Net 2.0 web service we need to insert/update data records into
> SQL Server 2005 that have been uploaded to us in XML documents.
> [rest of the original question snipped]
For SQL 2005, I would use SQL's built in XML processing. You can pass
the xml text in as a parameter to a stored procedure.

Here's a snippet from what I have done with something similar:

/*
declare @XmlDoc nvarchar(4000)
set @XmlDoc =
'<patient>
<answer>
<PatientID>13303</PatientID>
<QuestionID>200</QuestionID>
<AppointmentID>105084</AppointmentID>
<AnswerValue>217,218</AnswerValue>
<SaveTypeID>1</SaveTypeID>
</answer>
</patient>'

exec up_PatientAnswer_Save_FromXml @XmlDoc = @XmlDoc
*/
ALTER procedure [dbo].[up_PatientAnswer_Save_FromXml]
(
@XmlDoc ntext
)
as
set nocount on

declare @Answers table
(
AnswerID Int identity,
PatientID Int,
QuestionID Int,
AppointmentID Int,
SaveTypeID Int,
AnswerValue nvarchar(3900) null,
AnswerValueDate datetime null,
AnswerValueInt Int null
)

declare @Pointer Int

-- get a pointer to the parsed doc
exec sp_xml_preparedocument @Pointer output, @XmlDoc

-- copy into the table variable so we can release the xml doc
-- and work with the data more easily
insert into @Answers
(PatientID, QuestionID, AppointmentID, SaveTypeID, AnswerValue,
AnswerValueDate, AnswerValueInt)
select
PatientID, QuestionID, AppointmentID, SaveTypeID, AnswerValue,
AnswerValueDate, AnswerValueInt
from
openxml(@Pointer, '/patient/answer', 2)
with
(
PatientID int,
QuestionID int,
AppointmentID int,
SaveTypeID int,
AnswerValue nvarchar(3900),
AnswerValueDate datetime,
AnswerValueInt int
)

-- release the xml doc
exec sp_xml_removedocument @Pointer

-- processing from here is done with the data in the table variable


More info can be found here:

http://www.perfectxml.com/articles/xml/o...

If the XML that you receive is usable as is, you can just pass that
along. Or you can create new XML that is in a format that you expect.

You may also be able to use SqlBulkCopy to insert the data into a loading
table and have a stored procedure that works off of that table.
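A minimal sketch of that loading-table idea (the staging table,
destination table, and procedure names below are all made up for
illustration):

```sql
-- Staging table that a SqlBulkCopy call would target (names hypothetical)
create table dbo.PatientAnswer_Load
(
    PatientID Int,
    QuestionID Int,
    AppointmentID Int,
    SaveTypeID Int,
    AnswerValue nvarchar(3900) null
)

-- Procedure that moves the bulk-loaded rows into the real table
create procedure dbo.up_PatientAnswer_LoadFromStaging
as
set nocount on

insert into dbo.PatientAnswer
    (PatientID, QuestionID, AppointmentID, SaveTypeID, AnswerValue)
select PatientID, QuestionID, AppointmentID, SaveTypeID, AnswerValue
from dbo.PatientAnswer_Load

-- clear the staging table for the next upload
truncate table dbo.PatientAnswer_Load
```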

In SQL 2008, you will be able to pass a table-valued parameter (for
example, filled from a DataTable in ADO.NET) to a stored proc.

mj2736

3/5/2008 2:34:00 PM


Thanks for the suggestion. We've actually tried this - passing the
data in as XML - but found it to be way too resource intensive on the
database server for our purposes, particularly with large XML
documents. We observed occasional 'out of memory' errors despite
calling sp_xml_removedocument every time when finished. So we have
already ruled this out.

I am interested if anyone has any opinions about the idea of
concatenating multiple SP calls into a batch and sending the whole
batch to the database in one shot as CommandType.Text.

Thanks,
MJ

Alex Kuznetsov

3/5/2008 2:48:00 PM


On Mar 4, 8:27 pm, mj2...@yahoo.com wrote:
> In our .Net 2.0 web service we need to insert/update data records into
> SQL Server 2005 that have been uploaded to us in XML documents.
> [rest of the original question snipped]

SQL Server MVP Erland Sommarskog wrote an excellent article, "Arrays
and Lists in SQL Server". We use his approach: we pack up to 100K values
into an image parameter that mimics an array, and we can use several such
images. It performs very fast.
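One of the simpler techniques from that article passes a delimited string
and unpacks it server-side with a numbers table. A sketch of that variant
(it assumes an auxiliary table dbo.Numbers holding the integers 1..N
already exists):

```sql
-- Split a comma-separated list of ints into rows using a numbers table
create function dbo.SplitInts (@List nvarchar(max))
returns table
as
return
    select cast(substring(@List, n.Number,
               charindex(N',', @List + N',', n.Number) - n.Number) as int)
           as Value
    from dbo.Numbers as n
    where n.Number <= len(@List)
      and substring(N',' + @List, n.Number, 1) = N','
```

Calling select Value from dbo.SplitInts(N'13303,200,105084') returns the
three ids as rows, which you can then join against your tables.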

Cowboy

3/5/2008 3:32:00 PM


I still think XML may be your best option, but preload the XML into a single
table and then run processes on the data there. This works best if you can
create a table to load. You may need multiple load tables if you have
different clients. This will still hammer the server for a bit. It should
not be an issue for thousands of records, however. After you fill one table,
you need to get its primary key values into the load table, especially if
using row guids or IDENTITY.

When you get to millions of records, this can still cause issues with the
server. In these cases, you may want to preprocess outside of SQL Server. If
you opt for this, you essentially create bcp files (or files for
BULK_INSERT) outside of SQL. When you get to the step to create IDENTITY
values, you can use DBCC commands to increment a "hole" the size of the
records being loaded. The customer's unique key and new IDENTITY driven
primary key can be loaded into a hashtable. Load time for a million records
is about 2-3 seconds, with 1 million lookups taking far less than a second.
At least those were the stats we worked with on a project that preprocessed
offline.

If you can load into a single table from XML and then run commands, it is
less time consuming to code.

What about batching? It is a possibility, but realize you are just
flattening the performance curve. Rather than a single huge hit, you are
taking many smaller hits. This could well be the solution you are looking
for, but you may find that this approach fails as you grow in size. Of
course, you can always move SQL Server to a big box, the data files to a
SAN, etc. to alleviate some, if not all, issues. At least until you grow to
millions of files being loaded this way.

Another potential option is getting more transactional with the process that
creates the XML so single records are sent. If this is a client, it may not
be an option. Even internal, there may be valid reasons not to go record by
record.

Also, in most of these systems, you have to check for existing records
before loading, as inevitably someone sends you the same info. You may have
some system of checks and balances here to avoid this. If not, consider it,
as a process failure in the middle of a large batch of records can be
extremely hard to cure.
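On SQL 2005 (which has no MERGE yet), that existence check against a load
table can be sketched like this (table and column names are hypothetical):

```sql
-- Update rows that already exist, then insert only the new ones
update t
set    t.AnswerValue = s.AnswerValue
from   dbo.PatientAnswer as t
join   dbo.PatientAnswer_Load as s
  on   s.PatientID = t.PatientID and s.QuestionID = t.QuestionID

insert into dbo.PatientAnswer (PatientID, QuestionID, AnswerValue)
select s.PatientID, s.QuestionID, s.AnswerValue
from   dbo.PatientAnswer_Load as s
where  not exists (select 1 from dbo.PatientAnswer as t
                   where t.PatientID = s.PatientID
                     and t.QuestionID = s.QuestionID)
```

Wrapping both statements in one transaction keeps a mid-batch failure from
leaving the destination table half-updated.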

--
Gregory A. Beamer
MVP, MCP: +I, SE, SD, DBA

*************************************************
| Think outside the box!
|
*************************************************
<mj2736@yahoo.com> wrote in message
news:e95705f2-1199-473f-ba0a-62acce54aa36@n36g2000hse.googlegroups.com...
> In our .Net 2.0 web service we need to insert/update data records into
> SQL Server 2005 that have been uploaded to us in XML documents.
> [rest of the original question snipped]