Most efficient way to append arrays in C#?

I am retrieving data from an old-school ActiveX component as arrays of doubles. I do not know in advance how many samples I will ultimately extract.

What is the most efficient way to combine these arrays in C# as I pull them out of the system?

+53
memory-management arrays c#
Nov 20 '08 at 9:54
10 answers

You cannot append to a real array - the size of an array is fixed at creation time. Instead, use a List<T>, which can grow as needed.

Alternatively, keep a list of arrays and concatenate them all only once you have grabbed everything.

See Eric Lippert's blog post on arrays for more details and insights than I could really provide :)
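A minimal sketch of that approach (the chunk arrays below are made-up placeholders standing in for whatever each ActiveX call returns):

```csharp
using System;
using System.Collections.Generic;

class ListAccumulatorSketch
{
    static void Main()
    {
        // Placeholder chunks simulating successive reads from the component.
        double[] chunk1 = { 1.0, 2.0, 3.0 };
        double[] chunk2 = { 4.0, 5.0, 6.0 };

        List<double> samples = new List<double>();
        samples.AddRange(chunk1);   // AddRange copies each chunk into the list
        samples.AddRange(chunk2);

        double[] allSamples = samples.ToArray();  // one contiguous array at the end
        Console.WriteLine(allSamples.Length);     // 6
    }
}
```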

+67
Nov 20 '08 at 9:58

I believe that if you have two arrays of the same type that you want to combine into a third array, there is a very simple way to do this.

Here is the code:

    String[] theHTMLFiles = Directory.GetFiles(basePath, "*.html");
    String[] thexmlFiles = Directory.GetFiles(basePath, "*.xml");
    List<String> finalList = new List<String>(theHTMLFiles.Concat<string>(thexmlFiles));
    String[] finalArray = finalList.ToArray();
+24
Dec 23 '09 at 16:14

I recommend the answer found here: How to combine two arrays in C #?

eg.

    var z = new int[x.Length + y.Length];
    x.CopyTo(z, 0);
    y.CopyTo(z, x.Length);
+20
May 2 '12 at 10:26

Concatenating arrays is simple using the LINQ extensions that come standard with .NET 3.5 and later.

The most important thing to remember is that LINQ works with IEnumerable<T> objects, so to get an array back you have to call the .ToArray() method at the end.

An example of combining two byte arrays:

    byte[] firstArray = { 2, 45, 79, 33 };
    byte[] secondArray = { 55, 4, 7, 81 };
    byte[] result = firstArray.Concat(secondArray).ToArray();
+17
Jul 12 '18

These solutions look like a lot of fun, but you can concatenate arrays in just two statements. When you are handling large byte arrays, I believe it is inefficient to use a linked list to hold every byte.

Here is sample code for reading bytes from a stream and growing a byte array on the fly:

    byte[] buf = new byte[8192];
    byte[] result = new byte[0];
    int count = 0;
    do
    {
        count = resStream.Read(buf, 0, buf.Length);
        if (count != 0)
        {
            Array.Resize(ref result, result.Length + count);
            Array.Copy(buf, 0, result, result.Length - count, count);
        }
    }
    while (count > 0);  // any more data to read?
    resStream.Close();
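A footnote on this approach: Array.Resize reallocates and copies the entire result on every iteration, which is quadratic overall. A MemoryStream grows its internal buffer geometrically and does a single final copy instead. A sketch of the same read loop under that alternative, with the source stream simulated in memory (in the answer above, resStream is whatever readable Stream you have):

```csharp
using System;
using System.IO;

class MemoryStreamSketch
{
    static void Main()
    {
        // Simulated source; stands in for the resStream of the answer above.
        Stream resStream = new MemoryStream(new byte[20000]);

        byte[] buf = new byte[8192];
        byte[] result;
        using (MemoryStream ms = new MemoryStream())
        {
            int count;
            while ((count = resStream.Read(buf, 0, buf.Length)) > 0)
                ms.Write(buf, 0, count);   // internal buffer grows geometrically
            result = ms.ToArray();         // one copy into the final array
        }
        Console.WriteLine(result.Length);  // 20000
    }
}
```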
+6
Sep 01 '09 at 8:22

Using this, we can combine two arrays without writing any loop.

I believe that if you have two arrays of the same type that you want to combine into a single array, there is a very simple way to do this.

Here is the code:

    String[] TextFils = Directory.GetFiles(basePath, "*.txt");
    String[] ExcelFils = Directory.GetFiles(basePath, "*.xls");
    String[] finalArray = TextFils.Concat(ExcelFils).ToArray();

or

    String[] Fils = Directory.GetFiles(basePath, "*.txt");
    String[] ExcelFils = Directory.GetFiles(basePath, "*.xls");
    Fils = Fils.Concat(ExcelFils).ToArray();
+6
Aug 2 '12 at 11:17

If you can approximate the number of elements you will have at the end, use the List constructor overload that takes a capacity as a parameter. You will save a few expensive internal array reallocations and copies. Otherwise, you have to pay for them.

+4
Nov 20 '08 at 11:24

You may not need to concatenate the final result into a contiguous array at all. Instead, keep appending arrays to the list as Jon suggested. As a result you will have a jagged array (well, actually, an almost rectangular one). When you need to access an element by index, use the following indexing scheme:

 double x = list[i / sampleSize][i % sampleSize]; 

Iterating over a jagged array is also simple:

    for (int iRow = 0; iRow < list.Count; ++iRow)
    {
        double[] row = list[iRow];
        for (int iCol = 0; iCol < row.Length; ++iCol)
        {
            double x = row[iCol];
        }
    }

This saves memory allocation and copying at the cost of slightly slower element access. Whether this is a net performance win depends on the size of your data, your data access patterns, and memory constraints.

+4
Nov 20 '08 at 18:49

Here is a useful class based on what Konstantin said:

    class Program
    {
        static void Main(string[] args)
        {
            FastConcat<int> i = new FastConcat<int>();
            i.Add(new int[] { 0, 1, 2, 3, 4 });
            Console.WriteLine(i[0]);
            i.Add(new int[] { 5, 6, 7, 8, 9 });
            Console.WriteLine(i[4]);

            Console.WriteLine("Enumerator:");
            foreach (int val in i)
                Console.WriteLine(val);
            Console.ReadLine();
        }
    }

    class FastConcat<T> : IEnumerable<T>
    {
        LinkedList<T[]> _items = new LinkedList<T[]>();
        int _count;

        public int Count
        {
            get { return _count; }
        }

        public void Add(T[] items)
        {
            if (items == null)
                return;
            if (items.Length == 0)
                return;

            _items.AddLast(items);
            _count += items.Length;
        }

        private T[] GetItemIndex(int realIndex, out int offset)
        {
            offset = 0;           // Offset that needs to be applied to realIndex.
            int currentStart = 0; // Current index start.

            foreach (T[] items in _items)
            {
                currentStart += items.Length;
                if (currentStart > realIndex)
                    return items;
                offset = currentStart;
            }
            return null;
        }

        public T this[int index]
        {
            get
            {
                int offset;
                T[] i = GetItemIndex(index, out offset);
                return i[index - offset];
            }
            set
            {
                int offset;
                T[] i = GetItemIndex(index, out offset);
                i[index - offset] = value;
            }
        }

        #region IEnumerable<T> Members

        public IEnumerator<T> GetEnumerator()
        {
            foreach (T[] items in _items)
                foreach (T item in items)
                    yield return item;
        }

        #endregion

        #region IEnumerable Members

        System.Collections.IEnumerator System.Collections.IEnumerable.GetEnumerator()
        {
            return GetEnumerator();
        }

        #endregion
    }
+2
Nov 21 '08 at 14:55

Olmo's suggestion is very good, but I would add: if you are not sure about the size, it is better to set it a little too high than a little too low. When a list is full, keep in mind that it doubles its capacity to add more items.

For example: suppose you will need about 50 elements. If you use a capacity of 50 and the final number of elements is 51, you will end up with a list of capacity 100 with 49 wasted positions.
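The doubling is easy to observe through the Capacity property:

```csharp
using System;
using System.Collections.Generic;

class DoublingSketch
{
    static void Main()
    {
        var list = new List<int>(50);
        for (int i = 0; i < 51; i++)
            list.Add(i);                  // the 51st Add triggers a resize
        Console.WriteLine(list.Capacity); // 100: capacity doubled from 50
    }
}
```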

0
Nov 20 '08 at 14:47


