Multiple Users Writing in the Same File

I have a web API project that is accessed by many users (I really do mean a lot of users). Whenever a user does something through the external interface (an HTML5 web page), such as updating or retrieving data, the backend application (the web API) writes to one log file (a .log file whose content is JSON). The problem is that with many concurrent users the interface becomes unresponsive (it loads forever), and the cause is the log-writing process: a single log file that really is accessed by a lot of users. I heard that multithreading can solve the problem, but I don't know which method to use, so maybe someone can help me. Here is my code (sorry for any typos, I am writing this from my smartphone):

public static void JsonInputLogging<T>(T m, string methodName)
{
    MemoryStream ms = new MemoryStream();
    DataContractJsonSerializer ser = new DataContractJsonSerializer(typeof(T));
    ser.WriteObject(ms, m);
    string jsonString = Encoding.UTF8.GetString(ms.ToArray());
    ms.Close();
    logging("MethodName: " + methodName + Environment.NewLine + jsonString);
}

public static void logging(string message)
{
    string pathLogFile = @"D:\jsoninput.log";
    FileInfo jsonInputFile = new FileInfo(pathLogFile);
    if (File.Exists(jsonInputFile.ToString()))
    {
        long fileLength = jsonInputFile.Length;
        if (fileLength > 1000000)
        {
            File.Move(pathLogFile, pathLogFile.Replace(*some new path*));
        }
    }
    File.AppendAllText(pathLogFile, *some text*);
}
+7
Tags: multithreading, c#
3 answers

First you need to understand some internals. For every [x] users, ASP.NET will use one worker process, and one worker process contains multiple threads. If you use multiple instances in the cloud it is even worse, because then you also have multiple server instances (I assume that is not the case here).

A few issues here:

  • You have multiple users and therefore multiple threads.
  • Multiple threads can lock each other out while writing the file.
  • You have multiple application domains and therefore multiple processes.
  • Multiple processes can block each other on the file.

Opening and locking files

File.Open has several flags for locking. Basically you can lock a file exclusively per process, which is a good idea in this case. A two-step approach with Exists followed by Open will not help, because another worker process can get in between the two calls. The idea is instead to call Open requesting exclusive write access, and if that fails, to try again with a different file name.

This basically solves the problem with multiple processes.
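A minimal sketch of that idea, assuming nothing beyond the answer above (the class and method names here are illustrative, not from the original post): request an exclusive lock with FileShare.None, and on an IOException fall back to a numbered alternative file name.

```csharp
using System;
using System.IO;

public static class ExclusiveLogOpener
{
    // Tries to open the given path with an exclusive write lock.
    // If another process already holds the file, falls back to
    // numbered alternatives: log.1.log, log.2.log, ...
    public static FileStream OpenLogExclusive(string basePath, int maxAttempts = 5)
    {
        for (int i = 0; i < maxAttempts; i++)
        {
            string path = i == 0
                ? basePath
                : Path.ChangeExtension(basePath, i + ".log");
            try
            {
                // FileShare.None => no other process (or thread) can open
                // the file for reading or writing while we hold it.
                return new FileStream(path, FileMode.Append,
                                      FileAccess.Write, FileShare.None);
            }
            catch (IOException)
            {
                // File is locked by another worker process; try the next name.
            }
        }
        throw new IOException("Could not acquire an exclusive log file.");
    }
}
```

A second call while the first stream is still open will transparently land on the fallback name instead of blocking the request.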

Writing from multiple threads

File access is single-threaded. So instead of having every thread write to the file, use one dedicated thread that accesses the file, and multiple threads that tell it what to write.

If you get more log requests than you can handle, you end up in trouble anyway. In that case, the best way to handle it for logging, IMO, is to simply drop data: make the logger somewhat lossy in order to make life better for your users. You can use a queue for this as well.

I usually use a ConcurrentQueue for this, plus a separate thread that drains everything that has been logged.

This is basically how to do it:

// Starts the worker thread that drains the queue:
internal void Start()
{
    loggingWorker = new Thread(LogHandler)
    {
        Name = "Logging worker thread",
        IsBackground = true,
        Priority = ThreadPriority.BelowNormal
    };
    loggingWorker.Start();
}

We also need something that does the actual work, and some shared variables:

private Thread loggingWorker = null;
private int loggingWorkerState = 0;
private ManualResetEventSlim waiter = new ManualResetEventSlim();
private ConcurrentQueue<Tuple<LogMessageHandler, string>> queue =
    new ConcurrentQueue<Tuple<LogMessageHandler, string>>();

private void LogHandler(object o)
{
    Interlocked.Exchange(ref loggingWorkerState, 1);
    while (Interlocked.CompareExchange(ref loggingWorkerState, 1, 1) == 1)
    {
        waiter.Wait(TimeSpan.FromSeconds(10.0));
        waiter.Reset();
        Tuple<LogMessageHandler, string> item;
        while (queue.TryDequeue(out item))
        {
            writeToFile(item.Item1, item.Item2);
        }
    }
}

Basically, this code drains all items from a single thread, using a queue that is shared between threads. Note that ConcurrentQueue does not use locks for TryDequeue, so the client threads will not feel any pain from it.

The last thing you need is to add items to the queue. That is the easy part:

public void Add(LogMessageHandler l, string msg)
{
    if (queue.Count < MaxLogQueueSize)
    {
        queue.Enqueue(new Tuple<LogMessageHandler, string>(l, msg));
        waiter.Set();
    }
}

This code will be called from multiple threads. It is not 100% correct, because Count and Enqueue are not executed as one atomic operation, but for our intents and purposes it is good enough. It also does not block in Enqueue, and the waiter ensures that the items are drained by the other thread.

Wrap it all in a singleton pattern, add some more logic to it, and your problem should be solved.
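Here is a hedged sketch of what that singleton wrapper could look like, with the pieces above glued together. The Logger class name, the string-only payload, and the hard-coded file name are my assumptions for illustration, not the answerer's actual API:

```csharp
using System;
using System.Collections.Concurrent;
using System.IO;
using System.Threading;

public sealed class Logger
{
    // Lazy<T> gives a thread-safe singleton without explicit locking.
    private static readonly Lazy<Logger> instance =
        new Lazy<Logger>(() => new Logger());
    public static Logger Instance => instance.Value;

    private const int MaxLogQueueSize = 10000;
    private readonly ConcurrentQueue<string> queue = new ConcurrentQueue<string>();
    private readonly ManualResetEventSlim waiter = new ManualResetEventSlim();
    private readonly Thread worker;

    private Logger()
    {
        worker = new Thread(Drain)
        {
            Name = "Logging worker thread",
            IsBackground = true,
            Priority = ThreadPriority.BelowNormal
        };
        worker.Start();
    }

    // Called from many request threads; never touches the file itself.
    public void Add(string msg)
    {
        if (queue.Count < MaxLogQueueSize)   // lossy on overload, by design
        {
            queue.Enqueue(msg);
            waiter.Set();
        }
    }

    // The single thread that owns the file.
    private void Drain()
    {
        while (true)
        {
            waiter.Wait(TimeSpan.FromSeconds(10.0));
            waiter.Reset();
            string item;
            while (queue.TryDequeue(out item))
            {
                File.AppendAllText("jsoninput.log", item + Environment.NewLine);
            }
        }
    }
}
```

Callers then just do `Logger.Instance.Add("MethodName: ...");` and return immediately; only the background thread ever opens the file.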

+5

This can be problematic, because by default each client request is handled on a new thread anyway. You need some kind of "root" object that is known across the whole application (I think you can achieve this with a static class), so that you can lock on it before accessing the log file. Note, however, that this essentially serializes the requests and will probably have a very bad effect on performance.
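A minimal sketch of that lock-on-a-static-object approach, with illustrative names (and with the performance caveat above: every request now waits its turn for the file):

```csharp
using System.IO;

public static class SerializedLogger
{
    // One process-wide lock object; every request thread must take it
    // before touching the file, so writes are fully serialized.
    private static readonly object sync = new object();
    private const string PathLogFile = "jsoninput.log";

    public static void Logging(string message)
    {
        lock (sync)
        {
            File.AppendAllText(PathLogFile, message + System.Environment.NewLine);
        }
    }
}
```

Note this only serializes threads inside one worker process; it does nothing about multiple processes, which is why the accepted answer also locks the file itself.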

+1

Multithreading alone does not solve your problem. How should multiple threads write to the same file at the same time? You would need to take care of data consistency, and I don't think that is the real problem here.

What you are looking for is asynchronous programming. The reason your GUI becomes unresponsive is that it waits for the tasks to complete. If you know the logger is your bottleneck, then use async to your advantage: fire the log method and forget about the result, just let it write the file in the background.
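A sketch of that fire-and-forget idea (the class and method names are mine, not from the answer; in real code an unawaited Task should catch its own exceptions, as done here):

```csharp
using System;
using System.IO;
using System.Threading.Tasks;

public static class AsyncLogger
{
    // Kick off the write and return immediately; the request thread
    // never waits for the disk.
    public static Task LogAsync(string message)
    {
        return Task.Run(() =>
        {
            try
            {
                File.AppendAllText("jsoninput.log",
                                   message + Environment.NewLine);
            }
            catch (IOException)
            {
                // Swallow: with fire-and-forget logging, a dropped line
                // is preferable to an unobserved task exception.
            }
        });
    }
}
```

The caller writes `AsyncLogger.LogAsync("MethodName: ...");` without awaiting. Note that concurrent appends from several tasks can still contend on the file, which is exactly the multi-process/multi-thread issue the first answer addresses.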

Actually, I don't really think the problem is your logger. Are you sure there is no other logic that is blocking you?

+1
