I am making practical problems from the MCTS 70-536 Microsft.Net Framework Dev Foundation exam, and one of the problems is to create two classes: one common, one type of object that does the same thing; in which the loop uses the class and repeats over a thousand times. And using a timer, the execution time of both. There was another post in a C # generics question that was looking for the same questoion, but nonone answered.
Basically, if I run a generic class in my code, it processes the loger first. If I run an object class first than an object class, it takes more time to process. The whole idea was to prove that generics are faster.
I used the user's source code to save time. I did not see anything bad in the code and was puzzled by the result. Can anyone explain why the unusual results?
Thank,
Risho
Here is the code:
class Program
{
class Object_Sample
{
public Object_Sample()
{
Console.WriteLine("Object_Sample Class");
}
public long getTicks()
{
return DateTime.Now.Ticks;
}
public void display(Object a)
{
Console.WriteLine("{0}", a);
}
}
class Generics_Samle<T>
{
public Generics_Samle()
{
Console.WriteLine("Generics_Sample Class");
}
public long getTicks()
{
return DateTime.Now.Ticks;
}
public void display(T a)
{
Console.WriteLine("{0}", a);
}
}
static void Main(string[] args)
{
long ticks_initial, ticks_final, diff_generics, diff_object;
Object_Sample OS = new Object_Sample();
Generics_Samle<int> GS = new Generics_Samle<int>();
ticks_initial = 0;
ticks_final = 0;
ticks_initial = GS.getTicks();
for (int i = 0; i < 50000; i++)
{
GS.display(i);
}
ticks_final = GS.getTicks();
diff_generics = ticks_final - ticks_initial;
ticks_initial = 0;
ticks_final = 0;
ticks_initial = OS.getTicks();
for (int j = 0; j < 50000; j++)
{
OS.display(j);
}
ticks_final = OS.getTicks();
diff_object = ticks_final - ticks_initial;
Console.WriteLine("\nPerformance of Generics {0}", diff_generics);
Console.WriteLine("Performance of Object {0}", diff_object);
Console.ReadKey();
}
}
Risho source
share