Generics versus object performance

I am working through practice problems for the MCTS 70-536 Microsoft .NET Framework Development Foundation exam. One of the problems is to create two classes that do the same thing, one generic and one based on object, loop over each of them several thousand times, and compare the execution time of both using a timer. Another C# generics question here asked the same thing, but no one answered it.

Basically, if I run the generic class first in my code, the generic class takes longer to process. If I run the object class first, the object class takes longer. The whole idea was to prove that generics are faster.

I reused that user's source code to save time. I did not see anything wrong in the code and was puzzled by the result. Can anyone explain these unusual results?

Thanks,

Risho

Here is the code:

using System;

class Program
{
    class Object_Sample
    {            
        public Object_Sample()
        {
            Console.WriteLine("Object_Sample Class");
        }

        public long getTicks()
        {
            return DateTime.Now.Ticks;
        }

        public void display(Object a)
        {
            Console.WriteLine("{0}", a);
        }
    }

    class Generics_Sample<T>
    {            
        public Generics_Sample()
        {
            Console.WriteLine("Generics_Sample Class");
        }

        public long getTicks()
        {
            return DateTime.Now.Ticks;
        }

        public void display(T a)
        {
            Console.WriteLine("{0}", a);
        }
    }

    static void Main(string[] args)
    {            
        long ticks_initial, ticks_final, diff_generics, diff_object;
        Object_Sample OS = new Object_Sample();
        Generics_Sample<int> GS = new Generics_Sample<int>();

        //Generic Sample
        ticks_initial = 0;
        ticks_final = 0;
        ticks_initial = GS.getTicks();

        for (int i = 0; i < 50000; i++)
        {
            GS.display(i);
        }
        ticks_final = GS.getTicks();
        diff_generics = ticks_final - ticks_initial;

        //Object Sample
        ticks_initial = 0;
        ticks_final = 0;
        ticks_initial = OS.getTicks();

        for (int j = 0; j < 50000; j++)
        {
            OS.display(j);
        }

        ticks_final = OS.getTicks();
        diff_object = ticks_final - ticks_initial;

        Console.WriteLine("\nPerformance of Generics {0}", diff_generics);
        Console.WriteLine("Performance of Object {0}", diff_object);

        Console.ReadKey();
    }
}
+5
5 answers

Your test is incorrect. Here are your methods:

public void display(T a)
{
    Console.WriteLine("{0}", a); // Console.WriteLine(string format, params object[] args) <- boxing is performed here
}

public void display(Object a) // <- the int argument is boxed at the call site
{
    Console.WriteLine("{0}", a); 
}

So in both cases you incur boxing. It would be a much better test if your classes, for example, kept a running total of the values:

public void add(long a)
{
    Total += a;             // the long is used directly, no boxing
}

public void add(Object a)   // <- the long argument is boxed at the call site
{
    Total += (long) a;      // <- unboxing is performed here
}
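
Here is a minimal, self-contained sketch of that idea (the names TypedAdder, ObjectAdder, and BoxingBenchmark are made up for the example): it keeps the console I/O out of the timed loops and warms both methods up first, so the JIT cost is not measured.

using System;
using System.Diagnostics;

class TypedAdder
{
    public long Total;
    public void Add(long a) { Total += a; }            // the long is used directly
}

class ObjectAdder
{
    public long Total;
    public void Add(object a) { Total += (long)a; }    // unbox on every call
}

class BoxingBenchmark
{
    static void Main()
    {
        const int N = 10000000;
        TypedAdder typed = new TypedAdder();
        ObjectAdder boxed = new ObjectAdder();

        // Warm-up calls so both Add methods are JIT-compiled before timing.
        typed.Add(0);
        boxed.Add(0L);                                  // the 0L is boxed here

        Stopwatch sw = Stopwatch.StartNew();
        for (long i = 0; i < N; i++) typed.Add(i);
        sw.Stop();
        Console.WriteLine("Typed add:  {0} ms", sw.ElapsedMilliseconds);

        sw = Stopwatch.StartNew();
        for (long i = 0; i < N; i++) boxed.Add(i);      // one box allocation per call
        sw.Stop();
        Console.WriteLine("Object add: {0} ms", sw.ElapsedMilliseconds);
    }
}

The object version should come out slower, since every call allocates a box on the heap.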
+5

Well, the first problem I see is that you are using DateTime to measure elapsed time (over a very small interval).

You should use the Stopwatch class instead. It offers much better precision for timing code.

Second, remember the JIT (Just-In-Time) compiler. The first time a method is called it has to be JIT-compiled, and that one-time cost lands in whichever loop you run first. That is exactly the asymmetry you are seeing.

So call each method once before you start timing, so that both are already JIT-compiled when you measure them.
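
Putting those two points together, the timing part of the question's Main could look something like this sketch (it reuses the GS and OS objects from the code above and needs using System.Diagnostics; at the top of the file):

// Warm-up: call each method once so the JIT cost falls outside the timing.
GS.display(0);
OS.display(0);

Stopwatch sw = Stopwatch.StartNew();
for (int i = 0; i < 50000; i++)
{
    GS.display(i);
}
sw.Stop();
long diff_generics = sw.ElapsedTicks;

sw = Stopwatch.StartNew();
for (int j = 0; j < 50000; j++)
{
    OS.display(j);
}
sw.Stop();
long diff_object = sw.ElapsedTicks;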

+9

As the previous answer says, it is the JITter: whichever method runs first pays the one-time compilation cost.

+8

Why the boxing at all? When you pass an int to Console.WriteLine(string, params object[]), the int gets boxed.

edit: calling ToString() avoids the box: http://weblogs.asp.net/ngur/archive/2003/12/16/43856.aspx

Alternatively, use Console.WriteLine(a); that binds to the Console.WriteLine(Int32) overload, so there is no format string and no boxing (for the int case, at least).
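
For illustration, here is a sketch of display variants that avoid the box entirely (Int_Sample is a made-up name for the example):

using System;

class Generics_Sample<T>
{
    public void display(T a)
    {
        // Compiles to a constrained call: for a value type that overrides
        // ToString (Int32 does), the value is never boxed.
        Console.WriteLine(a.ToString());
    }
}

class Int_Sample
{
    public void display(int a)
    {
        // Binds to the Console.WriteLine(Int32) overload at compile time,
        // so there is no params object[] and no boxing.
        Console.WriteLine(a);
    }
}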

+3
  • Your timed code includes Console.WriteLine(). That I/O will take 99.999999% of the time.
  • Your assumption that generics will be faster in this situation is incorrect. You may have misinterpreted a remark about the non-generic collection classes.
  • It will not be on the exam.
+3
