Fluent NHibernate - Unnecessary Update

In the many-to-many relationship between Registration and Item , where a Registration has an ISet<Item> ItemsPurchased and Item holds no reference back to Registration (navigating the object graph in that direction isn't useful here), when I look at the generated SQL I see:

    INSERT INTO Registrations_Items (RegistrationId, ItemId) VALUES (@p0, @p1);
        @p0 = 1 [Type: Int32 (0)], @p1 = 1 [Type: Int32 (0)]

    UPDATE Items SET Price = @p0, Name = @p1, [...], ListIndex = @p5, EventId = @p6 WHERE ItemId = @p7

The parameters passed to the UPDATE are correct, but nothing has changed, so the UPDATE shouldn't be needed at all.

The mapping is done by the automapper, with this override for Registration and no overrides for Item . The DB schema looks correct. I removed all of my conventions and tested again and the behavior is unchanged, so it isn't caused by my mapping conventions.

mapping.HasManyToMany(e => e.ItemsPurchased).AsSet().Cascade.All().Not.Inverse();

Why is NHibernate issuing this UPDATE, and what can I do to stop it? It doesn't really hurt anything, but it suggests I did something wrong, so I would like to get to the bottom of it.
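
One way to see which property NHibernate considers dirty (and therefore what triggers the UPDATE) is to log the dirty check with an interceptor. The sketch below is added for illustration and is not part of the original question; DirtyCheckLoggingInterceptor is a made-up name, and the plain Equals comparison is only an approximation of NHibernate's type-aware dirty check:

    using System;
    using NHibernate;
    using NHibernate.Type;

    // Logs the properties NHibernate considers dirty at flush time, to reveal
    // which column is driving the unexpected UPDATE. Returning null keeps
    // NHibernate's default dirty-checking behavior.
    public class DirtyCheckLoggingInterceptor : EmptyInterceptor
    {
        public override int[] FindDirty(object entity, object id,
                                        object[] currentState, object[] previousState,
                                        string[] propertyNames, IType[] types)
        {
            if (previousState != null)
            {
                for (int i = 0; i < propertyNames.Length; i++)
                {
                    if (!Equals(currentState[i], previousState[i]))
                    {
                        Console.WriteLine("{0}#{1} dirty: {2} ({3} -> {4})",
                            entity.GetType().Name, id, propertyNames[i],
                            previousState[i] ?? "NULL", currentState[i] ?? "NULL");
                    }
                }
            }
            return null;
        }
    }

    // Usage: open the session with the interceptor attached, e.g.
    // var session = sessionFactory.OpenSession(new DirtyCheckLoggingInterceptor());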

Edit: In response to the comment below, I created a unit test that creates an Event (an Item must belong to an Event ), adds two Items to it, saves it and flushes the session, and then gets the Event back by its identifier.

I noticed something strange in the log below (look at the SELECT that is second from the bottom):

    INSERT INTO Events (blah blah blah...)
    select @@IDENTITY

    INSERT INTO Items (Price, Name, StartDate, EndDate, ExternalID, ListIndex, EventId) VALUES (@p0, @p1, @p2, @p3, @p4, @p5, @p6);
        @p0 = 100.42 [Type: Decimal (0)], @p1 = 'Item 1' [Type: String (0)], @p2 = NULL [Type: DateTime (0)], @p3 = NULL [Type: DateTime (0)], @p4 = '123' [Type: String (0)], @p5 = 0 [Type: Int32 (0)], @p6 = 1 [Type: Int32 (0)]
    select @@IDENTITY

    SELECT blah blah blah FROM Events event0_ WHERE event0_.EventId=@p0;
        @p0 = 1 [Type: Int32 (0)]

    SELECT itemsforsa0_.EventId as EventId1_, itemsforsa0_.ItemId as ItemId1_, itemsforsa0_.ListIndex as ListIndex1_, itemsforsa0_.ItemId as ItemId3_0_, itemsforsa0_.Price as Price3_0_, itemsforsa0_.Name as Name3_0_, itemsforsa0_.StartDate as StartDate3_0_, itemsforsa0_.EndDate as EndDate3_0_, itemsforsa0_.ExternalID as ExternalID3_0_, itemsforsa0_.ListIndex as ListIndex3_0_, itemsforsa0_.EventId as EventId3_0_ FROM Items itemsforsa0_ WHERE itemsforsa0_.EventId=@p0;
        @p0 = 1 [Type: Int32 (0)]

    UPDATE Items SET Price = @p0, Name = @p1, StartDate = @p2, EndDate = @p3, ExternalID = @p4, ListIndex = @p5, EventId = @p6 WHERE ItemId = @p7;
        @p0 = 100.42000 [Type: Decimal (0)], @p1 = 'Item 1' [Type: String (0)], @p2 = NULL [Type: DateTime (0)], @p3 = NULL [Type: DateTime (0)], @p4 = '123' [Type: String (0)], @p5 = 0 [Type: Int32 (0)], @p6 = 1 [Type: Int32 (0)], @p7 = 1 [Type: Int32 (0)]

The table is created correctly:

    create table Items (
        ItemId INT IDENTITY NOT NULL,
        Price NUMERIC(19,5) not null,
        Name NVARCHAR(255) not null,
        StartDate DATETIME null,
        EndDate DATETIME null,
        ExternalID NVARCHAR(255) not null,
        ListIndex INT not null,
        EventId INT not null,
        primary key (ItemId)
    )

The DateTime columns are intentionally nullable because an item may not need a specific date (an example would be "early bird registration").

+7
2 answers

These are called phantom updates, and they are usually related to how your entities are mapped.

Here is the typical cause:

Imagine we have an entity like this:

    public class Product
    {
        public Guid Id { get; set; }
        public int ReorderLevel { get; set; }
        public decimal UnitPrice { get; set; }
    }

and this mapping:

    public class ProductMap : ClassMap<Product>
    {
        public ProductMap()
        {
            Not.LazyLoad();
            Id(x => x.Id).GeneratedBy.GuidComb();
            Map(x => x.ReorderLevel);
            Map(x => x.UnitPrice).Not.Nullable();
        }
    }

Note that the ReorderLevel column is left nullable (there is no Not.Nullable() on it).

If you save this entity without setting ReorderLevel , it will be stored as NULL. When you later load it from the database, ReorderLevel is an int , so the NULL comes back as 0 . At the next flush NHibernate compares the loaded database state (NULL) with the entity state (0), marks the entity as dirty, and issues an UPDATE.

These errors are difficult to detect and track down. I recommend using Nullable<> types whenever you really need a NULL in the database.
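
For example, a minimal sketch of the fix for the Product entity above, assuming the only change needed is making the property nullable:

    public class Product
    {
        public Guid Id { get; set; }

        // int? round-trips a database NULL as null instead of 0,
        // so the dirty check no longer flags the entity after loading.
        public int? ReorderLevel { get; set; }

        public decimal UnitPrice { get; set; }
    }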

What I usually do is create a convention that automatically maps my value-type properties as nullable when they are declared as Nullable<> , and as not-nullable otherwise.

Just to complement the above, this is what my convention looks like:

    mapper.BeforeMapProperty += (ins, memb, cust) =>
    {
        var type = memb.LocalMember.GetPropertyOrFieldType();
        if (type.IsValueType)
        {
            if (type.IsGenericType && type.GetGenericTypeDefinition() == typeof(Nullable<>))
            {
                cust.Column(x => x.NotNullable(false));
            }
            else
            {
                cust.Column(x => x.NotNullable(true));
            }
        }
    };
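
Since the question uses Fluent NHibernate automapping rather than mapping by code, roughly the same rule could also be expressed as a Fluent NHibernate property convention. This is a sketch under that assumption; NullableValueTypeConvention is a hypothetical name:

    using System;
    using FluentNHibernate.Conventions;
    using FluentNHibernate.Conventions.Instances;

    // Same idea as the handler above: Nullable<T> value-type properties map to
    // nullable columns, all other value types map to NOT NULL columns.
    public class NullableValueTypeConvention : IPropertyConvention
    {
        public void Apply(IPropertyInstance instance)
        {
            var type = instance.Property.PropertyType;
            if (!type.IsValueType)
                return;

            if (type.IsGenericType && type.GetGenericTypeDefinition() == typeof(Nullable<>))
                instance.Nullable();
            else
                instance.Not.Nullable();
        }
    }

The convention would then be registered wherever the automapping is configured, for example with Conventions.Add<NullableValueTypeConvention>() on the auto-mapping setup.
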
+9

As I noted above (or below, who knows; see the comment I left on the other answer), the difference between my CanGenerateDatabaseSchema unit test and my CanGetItem unit test was that one gave me DECIMAL(6,2) and the other gave me DECIMAL(19,0).

After some digging I realized that CanGenerateDatabaseSchema used my "real" configuration (from the web project), while the other test used my "unit test" configuration. My unit tests were running against SQL Server CE ... when I changed the unit tests to use the same configuration as my real database (SQL Server 2005), the phantom update disappeared.

So, if anyone else comes across unexpected phantom updates on decimals ... check whether you are using SQL Server CE. Since the test actually passes (to be clear, nothing fails; it just does extra work), I think I can live with it, although why SQL CE ignores my precision configuration is a good question, and possibly a bug in NH or FNH.
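
If the root cause is the two configurations producing different decimal precisions, one possible mitigation (a sketch, not something from the original answer; ItemMappingOverride is a hypothetical name) is to pin the column type explicitly in a Fluent NHibernate auto-mapping override, so every configuration generates the same DECIMAL(19,5) column for Price. Whether SQL Server CE honors it is of course the open question above.

    using FluentNHibernate.Automapping;
    using FluentNHibernate.Automapping.Alterations;

    // Hypothetical override for Item (the question currently maps Item with no
    // overrides). CustomSqlType fixes the column definition used for schema
    // generation in both the test and "real" configurations.
    public class ItemMappingOverride : IAutoMappingOverride<Item>
    {
        public void Override(AutoMapping<Item> mapping)
        {
            mapping.Map(x => x.Price).CustomSqlType("DECIMAL(19,5)");
        }
    }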

+1
