Is multi-threading hard and unnecessary?

From Slashdot | More Interest In Parallel Programming Outside the US?

Threads: Threat or Menace (Score:5, Insightful)

by martincmartin (1094173) on Tuesday March 25 2008, @08:21AM (#22855576)

It always surprises me how many people say “we have to multithread our code, because computers are getting more cores,” not realizing:

  1. There are often other ways to do it, e.g. multiple processes communicating over sockets, or multiple processes that share memory.
  2. Threads are hard to get right. Really, really hard.
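The first alternative above can be sketched in a few lines: two processes cooperating over a message channel instead of sharing mutable state under locks. This is an illustrative sketch (the worker function and workload are made up, not from the post); the same pattern works over sockets.

```python
# Two processes communicating over a pipe: the only shared state is the
# message channel itself, so no mutexes or semaphores are needed.
from multiprocessing import Pipe, Process


def worker(conn):
    # Receive a unit of work, compute, and send the result back.
    n = conn.recv()
    conn.send(sum(i * i for i in range(n)))
    conn.close()


if __name__ == "__main__":
    parent, child = Pipe()
    p = Process(target=worker, args=(child,))
    p.start()
    parent.send(1000)
    print(parent.recv())  # sum of squares of 0..999
    p.join()
```

Because each process has its own address space, a bug in one cannot silently corrupt the other's data, which is a large part of why this style is easier to debug than fine-grained threading.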

When your library of mutexes, semaphores, etc. doesn’t have exactly the construct you need, and you go to write your own on top of them, it’s really, really hard not to introduce serious bugs that only show up very rarely. As one random example, consider the Linux kernel team’s attempts to write a mutex, as described in Ulrich Drepper’s paper “Futexes are Tricky.”
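To make the danger concrete, here is a deliberately broken hand-rolled lock of the kind the comment warns against. This is an illustrative sketch, not taken from Drepper's paper: the check and the set are two separate steps, so two threads can both observe the lock as free and enter the critical section together, and the failure may only show up under rare timing.

```python
# A naive "lock" built from a plain flag. Do NOT use this: the
# test-then-set sequence below is not atomic.
class BrokenLock:
    def __init__(self):
        self.locked = False

    def acquire(self):
        while self.locked:   # (1) check: another thread may pass here too
            pass             # busy-wait
        self.locked = True   # (2) set: too late, both threads proceed

    def release(self):
        self.locked = False
```

A correct lock needs the check and the set to happen as one atomic operation (a compare-and-swap, or a futex-style kernel primitive), which is exactly the subtlety that makes rolling your own so error-prone.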

If these people take years to get it right, what makes you think *you* can get it right in a reasonable time?

The irony is that threads are only practical (from a correctness/debugging point of view) when there isn’t much interaction between the threads.

By the way, I got that link from Drepper’s excellent “What Every Programmer Should Know about Memory.” It also talks about how threading can slow things down.

Re:Duh? (Score:5, Interesting)

by bit01 (644603) on Tuesday March 25 2008, @08:16AM (#22855538)

I work in parallel programming too.

Most problems do not parallelize to large scales.

I’m getting tired of this nonsense being propagated. Almost all real-world problems parallelize just fine, and to a scale sufficient to solve the problem with linear speedup. It’s only when people look at a narrow class of toy problems and artificial restrictions that parallelism “doesn’t apply.” For example, look at Google: it searches an index of the entire web in milliseconds using a large array of boxes. Even machine instructions are processed in parallel these days (out-of-order execution etc.).

Name a single real-world problem that doesn’t parallelize. I’ve asked this question on Slashdot on several occasions and I’ve never received a positive reply. Real-world problems like search, FEA, neural nets, compilation, database queries, and weather simulation all parallelize well. Problems like orbital mechanics don’t parallelize as easily, but then they don’t need parallelism to achieve bounded answers in faster than real time.

Note: I’m not talking about some problems being intrinsically hard (NP-complete etc.); many programmers seem to conflate “the problem is hard” with “the problem cannot be parallelized.” Some mediocre programmers also seem to regard parallel programming as voodoo, oblivious to the fact that they are typically programming a box with dozens of processors in it (keyboard, disk, graphics, printer, monitor etc.). Some also claim that because a serial programming language cannot be automatically parallelized, parallelism must be hard. Until we can program in a natural language, that just means they’re not using a parallel programming language appropriate for their target.

Advertising pays for nothing. Who do you think pays marketers’ salaries? You do, via higher-cost products.

