google
3/11/2008 2:18:00 PM
On Feb 22, 6:32 pm, "Cowboy \(Gregory A. Beamer\)"
<NoSpamMgbwo...@comcast.netNoSpamM> wrote:
> Definitely.
>
> When you first start using a new database, SQL Server creates statistics
> as queries come in. Until they exist, the first few queries run slowly,
> as will any new shape of query, because it cannot hit existing stats.
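What Gregory describes can be sketched in T-SQL; the database name below is a placeholder, not anything from the thread:

```sql
-- Check whether SQL Server will create/update statistics automatically
-- ('YourDatabase' is a placeholder name):
SELECT name, is_auto_create_stats_on, is_auto_update_stats_on
FROM sys.databases
WHERE name = 'YourDatabase';

-- Build statistics up front so the first real queries do not pay the cost:
USE YourDatabase;
EXEC sp_updatestats;
```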
>
> --
> Gregory A. Beamer
> MVP, MCP: +I, SE, SD, DBA
>
> *************************************************
> | Think outside the box!
> |
> *************************************************
>
> "dustbort" <d_nospam_bortner@rockcreekglobal_nospam_.com> wrote in message
>
> news:uTjHfcYdIHA.3940@TK2MSFTNGP05.phx.gbl...
>
>
>
> > Gregory,
>
> > Thanks for your ideas. I went to test them and it seems that the problem
> > has miraculously vanished. I had just copied the production database to
> > my workstation and updated the stored procedures for testing, prior to
> > having the slow results. I wonder if the database does some
> > initialization that had not completed at the time of the first tests?
> > Something that would affect the .NET SqlClient but not SQL Mgmt Studio?
>
> > "Cowboy (Gregory A. Beamer)" <NoSpamMgbwo...@comcast.netNoSpamM> wrote in
> > message news:u3dE0KYdIHA.2268@TK2MSFTNGP02.phx.gbl...
> >> Make sure you are testing properly. Run the query twice from ADO.NET and
> >> see if you have a different read time. There are two reasons I can think
> >> of, without much thought, why you might get different read times from #1
> >> to #2.
>
> >> 1. JIT time
> >> 2. Stats being compiled on SQL Server
>
> >> If you find a difference, it may be that simply preloading your app
> >> before use will solve your issue. If not, here are a variety of things
> >> you can look at:
>
> >> 1. Library used to connect to SQL Server - If you can use In Memory, it
> >> is fastest; none should be overall slow, so this is just shaving a bit of
> >> time, not a full solution
> >> 2. ADO.NET library used (ODBC, OLEDB or SQL)
> >> 3. Check the indexes
> >> a) Any fragmented indexes need to be rebuilt
> >> b) Do you have the right indexes for your query
> >> 4. Would adding some locking hints, etc. help your query?
> >> 5. Do you need hints for statistics?
> >> 6. Can you improve the query? - Look at the execution plan, as it will
> >> give you a lot of information on straightening out your world
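Points 3a and 6 above can be sketched as follows; the table name is a placeholder, and the 30% rebuild threshold is only a common rule of thumb, not something from the thread:

```sql
-- Find fragmented indexes on a table ('dbo.YourTable' is a placeholder):
SELECT i.name AS index_name,
       s.avg_fragmentation_in_percent
FROM sys.dm_db_index_physical_stats(
         DB_ID(), OBJECT_ID('dbo.YourTable'), NULL, NULL, 'LIMITED') AS s
JOIN sys.indexes AS i
  ON i.object_id = s.object_id AND i.index_id = s.index_id
WHERE s.avg_fragmentation_in_percent > 30;

-- Rebuild the indexes on that table:
ALTER INDEX ALL ON dbo.YourTable REBUILD;
```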
>
> >> If you are dynamically building statements in a sproc, it will end up
> >> recompiling each time and redoing its stats. If you compare what is
> >> actually run inside the sproc against running the sproc call captured in
> >> Profiler, you can see a huge difference in time. This is why you need to
> >> make sure you are running the sproc in each case and not just the
> >> statement. Depending on how you profile, you might only see the sproc
> >> call.
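One way to avoid the per-call recompile Gregory describes is to parameterize the dynamic statement with sp_executesql instead of concatenating values into the string; the table and column names below are placeholders:

```sql
-- Concatenating values produces a new statement text on each call, forcing
-- a fresh compile. Parameterizing lets one cached plan be reused:
DECLARE @sql nvarchar(max);
SET @sql = N'SELECT * FROM dbo.Orders WHERE CustomerId = @cust';
EXEC sp_executesql @sql, N'@cust int', @cust = 42;
```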
>
> >> --
> >> Gregory A. Beamer
> >> MVP, MCP: +I, SE, SD, DBA
>
> >> *************************************************
> >> | Think outside the box! |
> >> *************************************************
> >> "dustbort" <d_nospam_bortner@rockcreekglobal_nospam_.com> wrote in
> >> message news:%2321JRBYdIHA.4220@TK2MSFTNGP03.phx.gbl...
> >>> I have a stored procedure that, when called from SQL 2005 Management
> >>> Studio, takes less than one second to execute. The same SP, when called
> >>> from .NET code, takes about 13 seconds. I am using a SqlCommand object
> >>> with CommandType set to StoredProcedure, and I am passing arguments
> >>> through the parameters collection. I have tried using a SqlDataReader
> >>> and a DataAdapter to retrieve the data, but both are equally slow. From
> >>> stepping through the debugger, I know that the specific statement that
> >>> takes a long time to execute is either reader.ExecuteReader() or
> >>> dataAdapter.Fill(dataTable), depending on which method I tried. I did a
> >>> trace in the Profiler and got nearly identical results for either method
> >>> of the .NET SqlClient Data Provider. Here is an example:
>
> >>> EventClass: RPC:Completed
> >>> CPU: 13390
> >>> Reads: 559475
> >>> Writes: 0
> >>> Duration: 13496
> >>> Binary Data: (a long hex value)
>
> >>> When I copied the TextData (the SQL statement being executed) from
> >>> Profiler into SQL Management Studio, I got the following trace:
>
> >>> EventClass: SQL:BatchCompleted
> >>> CPU: 437
> >>> Reads: 9998
> >>> Writes: 0
> >>> Duration: 440
> >>> Binary Data: (empty)
>
> >>> (Immediately prior to this there is a corresponding SQL:BatchStarting
> >>> trace, with empty CPU, Reads, Writes, and Duration columns.)
>
> >>> What could explain the orders-of-magnitude difference in reads and
> >>> duration? Is the problem due to RPC? What about the binary data? I have
> >>> tried using the overload of ExecuteReader(CommandBehavior.SingleResult)
> >>> with no improvement. What can I try to improve it?
>
> >>> Thanks,
> >>> Dustin
I had a similar issue. I think it was due to fragmentation of the table
indexes. Look at the indexes on the tables.
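A quick way to list the indexes on a suspect table (the table name is a placeholder):

```sql
-- Shows each index, its type, and the columns it covers:
EXEC sp_helpindex 'dbo.YourTable';
```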