Data inconsistency in SQL Server 2000 due to a long-running transaction

We are facing a situation on one of our production servers. We have a specific stored procedure that performs an insert into one of the largest tables in the database (it has several million rows). This table is the most heavily loaded table in the database, and many operations depend on it.

Recently, we ran into a problem on one specific production server.

We run the Insert SP along with some other SP updates in one transaction, and we hit the "long-running transaction" problem for the Insert SP quite regularly. When the problem occurs, we see a typical pattern in the data inserted into the table: the datetime column is stored as NULL. Sometimes this happens for all rows, and sometimes only for a few rows. The datetime value is passed in from the application. Other update operations performed before and after the insert work fine.

We ran a SQL Profiler trace in our test environment (not on the production server), and found that the datetime value was passed in correctly every time.

Also, when we encounter the problem in production, we observe the following:

  • @@TRANCOUNT is 0, but DBCC OPENTRAN shows an open transaction.
  • The last wait type is NETWORKIO.
  • waittype is 0x0000.
  • The session status is sleeping.
  • The isolation level is READ UNCOMMITTED.
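For reference, a sketch of the diagnostics behind those observations on SQL Server 2000 (the database name and SPID 52 are placeholders):

```sql
-- Check for transactions left open in the database
DBCC OPENTRAN ('MyDatabase');   -- 'MyDatabase' is a placeholder name

-- Inspect the suspect session (replace 52 with the actual SPID)
SELECT spid, status, lastwaittype, waittype, open_tran
FROM master..sysprocesses
WHERE spid = 52;
```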

So our questions are:

  • Why is the datetime inserted as NULL only in this particular situation?
  • How can we avoid this situation, as well as the long-running transaction?
  • What could be causing this on one specific server?

Thank you in advance,

Abhijit

+6
sql sql-server-2000
3 answers

There are several things that, it seems to me, need to be addressed at once. First and foremost, I highly recommend that you examine this transaction to see what you can do to reduce the load it puts on your system. Can it be broken down into several smaller transactions? Would better indexes help? Could it capture a subset of the data into a temporary table to reduce the number of SELECTs running against the main table? The list of questions here could go on for some time.
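As a hedged sketch of what "breaking it down" might look like, a batched insert that commits frequently so locks are released between batches (the table, staging table, and column names here are invented for illustration):

```sql
-- Insert in small batches so each transaction commits quickly.
-- BigTable, Staging, and their columns are hypothetical names.
SET ROWCOUNT 5000;            -- SQL Server 2000 idiom for limiting batch size

WHILE 1 = 1
BEGIN
    BEGIN TRANSACTION;

    INSERT INTO BigTable (Id, CreatedAt, Payload)
    SELECT Id, CreatedAt, Payload
    FROM Staging s
    WHERE NOT EXISTS (SELECT 1 FROM BigTable b WHERE b.Id = s.Id);

    IF @@ROWCOUNT = 0
    BEGIN
        COMMIT TRANSACTION;
        BREAK;                -- nothing left to copy
    END

    COMMIT TRANSACTION;       -- release locks between batches
END

SET ROWCOUNT 0;               -- always reset ROWCOUNT afterwards
```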

Then look at the application that starts the transaction. Is it generating the dates? If so, how? If it just passes GETDATE(), then SQL Server is doing the work. However, if the application passes in the date value, I would make sure that value is ALWAYS valid. Also check that the server's date-format settings match what the application sends. For example, if your application passes a European-style date such as 14-05-2011, SQL Server may choke on it when it expects Month-Day-Year, since 14 cannot be converted to a month.
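To illustrate that ambiguity (the literal values here are just examples):

```sql
-- With a Month-Day-Year setting, '14-05-2011' is not a valid date.
SET DATEFORMAT mdy;
SELECT CAST('14-05-2011' AS datetime);   -- fails: 14 is not a month

-- The same string converts cleanly once the format matches.
SET DATEFORMAT dmy;
SELECT CAST('14-05-2011' AS datetime);   -- 14 May 2011

-- Safest: pass an unambiguous ISO-style format from the application.
SELECT CAST('20110514' AS datetime);
```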

Third, review your tables to see whether you have triggers configured on them. If you do, look carefully at each trigger. It is quite possible that a trigger is causing a conflict within your transaction. Perhaps you write the data, and a trigger comes back and updates it (or evaluates the date, decides it is invalid, and overwrites it; see the example above).
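A quick way to review them on SQL Server 2000 (the table and trigger names below are placeholders):

```sql
-- List the triggers defined on the table
EXEC sp_helptrigger 'dbo.BigTable';       -- 'dbo.BigTable' is a placeholder

-- View a trigger's source to check whether it rewrites the datetime column
EXEC sp_helptext 'trg_BigTable_Insert';   -- hypothetical trigger name
```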

Fourth, check the data before leaving the transaction. Read it as it goes in, and read it again after performing the INSERT. Perhaps a client operation that runs after the transaction completes is wiping out the date value.
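A sketch of such a check inside the procedure, immediately after the insert (the column and variable names are invented):

```sql
-- Inside the stored procedure, right after the INSERT:
INSERT INTO BigTable (Id, CreatedAt, Payload)
VALUES (@Id, @CreatedAt, @Payload);

-- Verify the datetime actually landed before the transaction commits
IF EXISTS (SELECT 1 FROM BigTable WHERE Id = @Id AND CreatedAt IS NULL)
BEGIN
    RAISERROR ('CreatedAt was stored as NULL for Id %d', 16, 1, @Id);
    ROLLBACK TRANSACTION;
    RETURN;
END
```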

Finally, look at your test environment. If this works in test but fails in production, something differs between the two systems, and that difference is the cause, directly or indirectly. Maybe it is hardware (bad RAM?), or maybe it is something in the configuration (locking, clients performing actions against the data, etc.).

Beyond that, here is a link to a forum thread with some other potential answers for you:

http://groups.google.com/group/comp.databases.ms-sqlserver/browse_thread/thread/1063b65df1f97492/8649bee2002646a2

+1

It sounds like you have a tangle of code. It would be wise to isolate your inserts, updates, etc. into smaller transactions.

To find where the hang occurs, you can simply add several PRINT ('Made it this far...') steps to your procedure. Then run the procedure from SSMS and check the Messages tab. This is an ugly way of debugging, but if you don't want to use smaller transactions, error trapping with @@ERROR, and so on, this might be your best bet.
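For example (the procedure body below is hypothetical):

```sql
-- Hypothetical procedure instrumented with PRINT checkpoints
CREATE PROCEDURE dbo.usp_InsertBigTable
    @Id int, @CreatedAt datetime
AS
BEGIN
    PRINT 'Made it this far: starting transaction';
    BEGIN TRANSACTION;

    PRINT 'Made it this far: before INSERT';
    INSERT INTO BigTable (Id, CreatedAt) VALUES (@Id, @CreatedAt);

    PRINT 'Made it this far: before COMMIT';
    COMMIT TRANSACTION;
END
```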

+1

"We perform SP insertion along with some other SP updates in one transaction"

I would recommend decoupling your inserts and updates. If you know you need to make certain updates after the insert, update triggers might be the right answer.
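If the follow-up updates really must happen with every insert, a trigger could carry them instead of the wrapping transaction; a minimal sketch, with all table, column, and trigger names invented for illustration:

```sql
-- Hypothetical trigger: run the follow-up update as part of the insert itself
CREATE TRIGGER trg_BigTable_AfterInsert
ON dbo.BigTable
AFTER INSERT
AS
BEGIN
    -- Keep a summary table in step with the newly inserted rows
    UPDATE s
    SET s.TotalRows = s.TotalRows + 1
    FROM dbo.SummaryTable s
    JOIN inserted i ON i.CategoryId = s.CategoryId;
END
```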

I also suggest you profile for locking and blocking. I say this because, from what you have said, you may well be touching the same rows more than once within your root transaction, given that you have wrapped all of this activity into the same process.
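On SQL Server 2000, a quick first look at locking and blocking (SPID 52 is a placeholder):

```sql
-- Who is blocking whom? Check the BlkBy column in the output.
EXEC sp_who2;

-- What locks does a suspect session hold? (replace 52 with the real SPID)
EXEC sp_lock 52;

-- Which statement is that session currently running?
DBCC INPUTBUFFER (52);
```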

+1
