Does hyperthreading lead to unstable systems?

I am building a PC with a new quad-core Intel i7 processor. With hyperthreading enabled, Task Manager reports 8 cores.

Some of my colleagues say that hyperthreading will make the system unreliable and suggest disabling it.

Can any of you good people enlighten me and the other users here?

Follow-up: I went ahead with hyperthreading in place, and there has been no instability at all. I use:

  • Microsoft Windows Server 2008 64-bit
  • Microsoft SQL Server 2008 64-bit
  • Microsoft Visual Studio 2008
  • Diskeeper Server
  • Many third-party controls and tools (Telerik, Dundas, Rebex, ReSharper)
+6
intel hyperthreading
15 answers

Stability is unlikely to be affected, since the abstraction sits at a very low level and the OS simply sees each hyperthread as another processor to put work on. Performance, however, is another matter.
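For illustration, here is a minimal C++11 sketch (not from the original post) that simply asks the standard library how many hardware threads the OS exposes; on a quad-core i7 the typical output would be 8 with HT enabled and 4 with it disabled, though the exact numbers are machine-dependent assumptions:

    #include <iostream>
    #include <thread>

    int main() {
        // hardware_concurrency() reports logical processors, i.e. what the
        // OS schedules on, not physical cores. With hyperthreading each
        // physical core shows up as two logical processors.
        std::cout << "Logical processors visible to the OS: "
                  << std::thread::hardware_concurrency() << "\n";
        return 0;
    }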

Honestly, I can't say whether this is still the case, but at least when HT-capable processors first appeared there were known problems with some applications. MySQL, and multi-threaded applications such as the Java application I support in my day job, were known to show reduced performance with HT enabled. We always recommended turning it off, at least for our specific case of a server-side enterprise application.

Perhaps this is no longer an issue, and in a desktop environment it is unlikely to matter for most use cases. Being able to split work across the CPU would, in general, tend to make applications more responsive when the processor is heavily used. However, the extra context switching and overhead can be a detriment when the application is already heavily threaded and CPU-bound, as in the case of a database server.

+9

Off the top of my head, I can think of a couple of reasons why your colleagues might say this.

  • There have been a few articles about SQL Server performance suffering under hyperthreading. I believe it came down to too much context switching or cache thrashing; I don't remember exactly.

  • Early in the transition from single-processor to multi-processor machines (or, for most people, to hyperthreaded ones), a lot of latent threading problems were discovered: race conditions, deadlocks and the like that had never shown up before. Even though the code was at fault, the extra processors took some of the blame. (A sketch of that kind of bug follows this list.)
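As a hypothetical illustration of that second point, a minimal C++ sketch of a latent data race: two threads increment a shared counter without any synchronization, a bug that may go unnoticed on a single-processor machine but shows up almost immediately once the threads genuinely run in parallel, whether on separate cores or on hyperthreads:

    #include <iostream>
    #include <thread>

    int counter = 0;  // shared and deliberately unsynchronized

    void worker() {
        for (int i = 0; i < 1000000; ++i)
            ++counter;  // unprotected read-modify-write: a data race
    }

    int main() {
        std::thread a(worker), b(worker);
        a.join();
        b.join();
        // With real parallelism, lost updates make this far less than 2000000.
        std::cout << "counter = " << counter << " (expected 2000000)\n";
        return 0;
    }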

Do they make the same claims about multi-core / multi-processor machines, or just about hyperthreading?

As for me, I've been developing on a hyperthreaded box for four years now, and the only problem I've hit was a UI deadlock of my own creation.

+4

Hyperthreading mainly affects the behavior/performance of the scheduler, which may dispatch threads onto the same physical processor rather than onto different CPUs...

That will show up in a poorly coded application that does not handle race conditions between threads...

So it is usually bad design/code... that suddenly finds a failure mode.
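A minimal, hedged C++ sketch of the difference: a shared counter implemented as a std::atomic behaves identically whether its threads run on hyperthreads, on separate cores, or time-sliced on a single CPU, while the same counter left as a plain int does not:

    #include <atomic>
    #include <iostream>
    #include <thread>

    std::atomic<int> counter{0};  // atomic read-modify-write, no lost updates

    void worker() {
        for (int i = 0; i < 1000000; ++i)
            counter.fetch_add(1, std::memory_order_relaxed);
    }

    int main() {
        std::thread a(worker), b(worker);
        a.join();
        b.join();
        std::cout << "counter = " << counter << " (always 2000000)\n";
        return 0;
    }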

+3

Act up? I doubt it. The only drawback of hyperthreading that I can think of is that if the OS is not aware of it, it may schedule two threads on the same physical processor while other physical processors are idle, which will degrade performance.
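For what it's worth, a hedged sketch of how an application could work around that on Windows by pinning its busy threads to distinct physical cores itself. It assumes (and this is only an assumption; real code should query GetLogicalProcessorInformation) that logical processors 2n and 2n+1 are the two hyperthreads of physical core n, and modern Windows schedulers are HT-aware anyway, so treat it as purely illustrative:

    #include <windows.h>
    #include <thread>
    #include <vector>

    void busy_work() {
        // CPU-bound work would go here.
    }

    int main() {
        std::vector<std::thread> workers;
        for (int core = 0; core < 4; ++core) {
            workers.emplace_back([core] {
                // Pin this thread to the first hyperthread of physical core
                // 'core' so two busy workers never share one core while
                // other cores sit idle.
                SetThreadAffinityMask(GetCurrentThread(),
                                      DWORD_PTR(1) << (2 * core));
                busy_work();
            });
        }
        for (auto& t : workers) t.join();
        return 0;
    }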

+2

There was an issue with SQL Server and hyperthreading for some queries, because SQL Server has its own scheduler; setting the maximum degree of parallelism (MAXDOP) to 1 would work around it.

+2

To whatever extent Windows is unstable, it is unlikely that hyperthreading contributes significantly (or that would have been big news by now).

+2

I had a hyperthreaded computer for a couple of years. Not many cores, but it worked fine for me.

I wish I had test data to prove your colleagues wrong, but it seems it is just my opinion against theirs at the moment. ;)

+1

Threads on a hyperthreaded processor share the same cache and, as such, do not suffer from the cache-consistency problems that a multi-processor architecture can have. That said, if the developer of a piece of software is programming with multiple processors in mind, they will (or should) be writing with release semantics (iirc that's the term), i.e. all writes are flushed from the cache immediately.
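As a rough sketch of what "writing with release semantics" means in practice, here is a minimal C++11 example (std::atomic post-dates the 2008-era tools discussed in the question, so treat it purely as an illustration): the producer publishes its data with a release store, and the consumer's acquire load guarantees it sees that data, whether the two threads share a cache as hyperthreads or sit on separate processors:

    #include <atomic>
    #include <iostream>
    #include <thread>

    int payload = 0;                 // ordinary shared data
    std::atomic<bool> ready{false};  // publication flag

    void producer() {
        payload = 42;                                  // plain write
        ready.store(true, std::memory_order_release);  // publish it
    }

    void consumer() {
        while (!ready.load(std::memory_order_acquire)) // pairs with the release
            ;                                          // spin until published
        std::cout << payload << "\n";                  // guaranteed to print 42
    }

    int main() {
        std::thread c(consumer), p(producer);
        p.join();
        c.join();
        return 0;
    }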

+1

As far as I know, the OS does not see a hyperthread as being any different from an actual separate core. From the OS's point of view there is no difference; it is abstracted away.

So, aside from the fact that the "extra cores" provided by hyperthreading are not "real" (in a strictly technical sense) and do not deliver the full performance of "real" CPU cores, I don't see why it would be any less reliable. Slower, perhaps, in some rare cases, but no less reliable.

Of course, it depends on what you run. I suppose some applications that get "down and dirty" with the CPU could be confused by hyperthreading, but that is probably quite rare.

I've been running a hyperthreaded machine for several years now and I haven't seen any stability issues.

Sorry I don't have more specific data!

+1

I have an i7 system and I had no problems.

If it works with multiple cores, it works with hyperthreading.

+1

The short answer is yes.

The long answer, as with almost every question, is "it depends". It depends on the OS, the software, the processor revision, etc. I personally have had to disable hyperthreading in two cases to get software functioning correctly (once with the Synergy application and once with the Windows NT 4.0 installer), but your mileage may vary.

As long as you install Windows with the multiple HT cores detected from the very beginning (so it loads the relevant drivers and so on), you can always disable (and re-enable) HT "after the fact". If you run into strange stability issues with certain software that you cannot otherwise resolve, it is easy enough to disable HT and see whether it has any effect.

I would not turn it off, because frankly it will probably work fine in 99.99% of your daily use. But keep in mind that yes, it can occasionally lead to strange behavior, so don't rule it out if you end up chasing down something very odd in the future.

0

Personally, I found that hyperthreading, while it doesn't cause any problems, doesn't really help either. It's maybe worth an extra 0.1 of a processor. On my HT machine at work, I only very rarely see my CPU go above 50%. I don't know whether HT has improved with newer processors like the i7, but I'm not optimistic.

0

With the exception of a few reports about SQL Server, everything I can report is positive: I get about 25% better performance in heavily multi-threaded applications with HT enabled. I have never run into a problem, and I am using a first-generation HT processor...

0

Late to the party, but for future reference:

I am currently having an issue with SQL Server and hyperthreading. Basically, my understanding is that the two hyperthreads on the same physical processor share the same L1 and L2 cache, which can cause problems between them. Citrix apparently has this issue as well, from what I'm reading.

Slava Ok wrote a good blog post on it.
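To make the cache-sharing point concrete, here is a hedged C++ sketch (the 256 KB figure is an assumed per-core L2 size, not a measurement): each thread repeatedly streams over its own buffer sized to roughly fill one core's L2, so if the two threads land on sibling hyperthreads they keep evicting each other's working set and the run takes noticeably longer than when they run on separate cores:

    #include <chrono>
    #include <cstddef>
    #include <iostream>
    #include <numeric>
    #include <thread>
    #include <vector>

    // Assumption: roughly one core's worth of L2 cache.
    constexpr std::size_t kBufBytes = 256 * 1024;

    void stream_sum(std::vector<int>& buf, long long& out) {
        long long sum = 0;
        for (int pass = 0; pass < 2000; ++pass)
            sum += std::accumulate(buf.begin(), buf.end(), 0LL);
        out = sum;
    }

    int main() {
        std::vector<int> a(kBufBytes / sizeof(int), 1), b(a);
        long long ra = 0, rb = 0;

        auto start = std::chrono::steady_clock::now();
        std::thread t1(stream_sum, std::ref(a), std::ref(ra));
        std::thread t2(stream_sum, std::ref(b), std::ref(rb));
        t1.join();
        t2.join();
        auto ms = std::chrono::duration_cast<std::chrono::milliseconds>(
                      std::chrono::steady_clock::now() - start).count();

        // Two hyperthreads on one core compete for the same L1/L2;
        // two separate cores each get their own.
        std::cout << "elapsed: " << ms << " ms (checksum " << ra + rb << ")\n";
        return 0;
    }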

0

I am very late here, but I found this page through Google, and I may have discovered a very subtle problem. I have an i7 950 server running Windows Server 2003, and it's great. I initially left hyperthreading enabled in the BIOS, but during some testing and burn-in I ran a program called "crashme" by Carrette. This program tries to crash the OS by spawning processes and feeding them garbage to try to run. My dual Opteron setup ran it forever without problems, but the 950 crashed within an hour. It didn't crash on anything else unless I did something stupid, so it was very surprising. On a whim, I turned off HT and ran the program again. It ran all night, even with several copies going. One anecdote doesn't mean much, but give it a try and see what happens. Also, the processor seems to run slightly cooler at any given load with HT off. YMMV.

0
