"This process takes too much time, so I need to reduce this time."
This process consists of three subprocesses:
- Retrieving the > 10m records from the database
- Writing the records to a file
- Transferring the records over the network (my presumption is that you're working with a local client and a remote database)
Any or all of these could be the bottleneck. So, if you want to reduce the total elapsed time, you need to find out where that time is actually spent. You will probably need to instrument your C# code to get the metrics.
If it turns out the query is the problem, then you will need to tune it. Indexes won't help here, since you're retrieving a large chunk of the table (> 10%); what will help is improving the performance of the full table scan. For example, increasing memory to avoid sorts spilling to disk. Parallel query could be useful (if you have Enterprise Edition and enough CPUs). Also check that the problem isn't hardware (spindle contention, dodgy interconnects, etc.).
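As a sketch of the parallel-query idea: in Oracle you can request a parallel full scan with optimizer hints. The table name `big_table` and the degree of parallelism (4) below are placeholder assumptions, not something from the original question.

```sql
-- Hypothetical example: big_table and the degree (4) are placeholders.
-- FULL asks for a full table scan; PARALLEL splits the scan across slaves.
SELECT /*+ FULL(t) PARALLEL(t, 4) */ *
FROM   big_table t;
```

The degree should be chosen with the available CPUs (and Enterprise Edition licensing) in mind; over-parallelising can make things worse by saturating I/O.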
Could writing to the file be a problem? Perhaps your disk is slow for some reason (e.g. fragmentation), or perhaps you're contending with other processes writing to the same directory.
Shunting large quantities of data across the network is an obvious potential bottleneck. Are you sure you're only sending the finished data to the client?
An alternative architecture: use PL/SQL to write the records to a file on the database server, using bulk collect to retrieve manageable batches of records, and then transfer the file to wherever you need it at the end, e.g. via FTP, perhaps compressing it first.
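A minimal sketch of that PL/SQL approach, assuming a table `big_table` with `id` and `name` columns and an Oracle directory object `EXTRACT_DIR` pointing at a writable path on the database server (all of these names are placeholders):

```sql
-- Hypothetical sketch: big_table, its columns, EXTRACT_DIR and extract.csv
-- are all assumed names. BULK COLLECT ... LIMIT keeps memory use bounded.
DECLARE
  TYPE t_rows IS TABLE OF big_table%ROWTYPE;
  l_rows  t_rows;
  l_file  UTL_FILE.FILE_TYPE;
  CURSOR c IS SELECT * FROM big_table;
BEGIN
  -- Open the output file on the database server (max line size 32767).
  l_file := UTL_FILE.FOPEN('EXTRACT_DIR', 'extract.csv', 'w', 32767);
  OPEN c;
  LOOP
    -- Fetch in manageable batches rather than row-by-row or all at once.
    FETCH c BULK COLLECT INTO l_rows LIMIT 1000;
    EXIT WHEN l_rows.COUNT = 0;
    FOR i IN 1 .. l_rows.COUNT LOOP
      UTL_FILE.PUT_LINE(l_file, l_rows(i).id || ',' || l_rows(i).name);
    END LOOP;
  END LOOP;
  CLOSE c;
  UTL_FILE.FCLOSE(l_file);
END;
/
```

Writing on the server this way avoids pushing 10m rows through the client round-trip by round-trip; only the finished (and possibly compressed) file crosses the network.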
APC