I am running a LINQ query that uses let clauses to compute local values, which are then compared against a parameter passed to the query.
I have the following query:
Items = from lead in Items
        let total = lead.NurtureActions.Count(q => !q.Deleted)
        let completed = lead.NurtureActions.Count(nurtureAction => nurtureAction.Completed && !nurtureAction.Deleted)
        let percentage = Math.Round((Double)((completed/total)*100), 0, MidpointRounding.ToEven)
        where total > 0 && percentage == progress
        select lead;
The part that really matters is the following line:
let percentage = Math.Round((Double)((completed/total)*100),0,MidpointRounding.ToEven)
As you can see in the query, I compare the result of this computation with progress, which is passed into my function.
For example, I might pass the value 8, so progress will be 8. The calculated percentage may initially be something like 8.223, but I need it rounded to 8.
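In isolation, Math.Round seems to behave the way I expect. This snippet is just a sanity check I put together, not my actual code:

```csharp
using System;

class RoundCheck
{
    static void Main()
    {
        // Banker's rounding to zero decimal places, as in the query.
        Console.WriteLine(Math.Round(8.223, 0, MidpointRounding.ToEven));     // 8
        Console.WriteLine(Math.Round(8.5, 0, MidpointRounding.ToEven));       // 8 (midpoint rounds to even)
        Console.WriteLine(Math.Round(8.5, 0, MidpointRounding.AwayFromZero)); // 9
    }
}
```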
I thought I was doing this correctly, but for some reason it is not working.
Any ideas on what might be breaking the rounding? I also tried MidpointRounding.AwayFromZero, but that doesn't work either.
Edit: For those who requested more information: I'm not sure what value is actually being calculated. I have no experience debugging LINQ queries and don't know where to start to find out what this value is. I would provide it if I knew how to get it.
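If it helps, here is roughly how I would try to inspect the intermediate values outside the query. The Lead and NurtureAction classes below are simplified, hypothetical stand-ins for my real entity types; the loop mirrors the let clauses of the query so each value can be printed or watched in a debugger:

```csharp
using System;
using System.Collections.Generic;
using System.Linq;

// Hypothetical, simplified stand-ins for my real entity types.
class NurtureAction { public bool Completed; public bool Deleted; }
class Lead { public List<NurtureAction> NurtureActions = new List<NurtureAction>(); }

class Program
{
    static void Main()
    {
        var Items = new List<Lead>(); // populated from my data source in the real code

        // Mirror the 'let' clauses one lead at a time so each
        // intermediate value can be inspected.
        foreach (var lead in Items)
        {
            var total = lead.NurtureActions.Count(q => !q.Deleted);
            if (total == 0) continue; // same guard as 'where total > 0'

            var completed = lead.NurtureActions.Count(a => a.Completed && !a.Deleted);
            var percentage = Math.Round((Double)((completed / total) * 100), 0, MidpointRounding.ToEven);
            Console.WriteLine($"total={total}, completed={completed}, percentage={percentage}");
        }
    }
}
```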