Projects like distributed.net, SETI@Home, Entropia, and others have attempted to harness the large number of computers available via the Internet to perform massive computational tasks. The model of computation is that a problem is broken up into tasks, and each computer in the project requests a task from a central coordinator. When it finishes computing the task, it returns the result to the coordinator and requests a new task; this continues until the project is completed. To keep network bandwidth to a minimum, the description and result of a task should be short, and each task should take a moderate amount of time (say, an hour or a day).
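To make the protocol concrete, here is a minimal in-process sketch of the request/compute/return loop, written in Python. The `Coordinator` class, the range-summing task, and all names here are hypothetical stand-ins for illustration only; a real project would exchange these messages over the network rather than through method calls.

```python
from queue import Queue, Empty

class Coordinator:
    """Hands out pending tasks and collects results until the project is done."""
    def __init__(self, tasks):
        self.pending = Queue()
        for t in tasks:
            self.pending.put(t)
        self.results = []

    def request_task(self):
        try:
            return self.pending.get_nowait()
        except Empty:
            return None  # no tasks left: the project is completed

    def submit_result(self, result):
        self.results.append(result)

def worker(coordinator):
    """Request a task, compute it, return the result; repeat until done."""
    while (task := coordinator.request_task()) is not None:
        lo, hi = task                       # a short task description
        result = sum(range(lo, hi))         # the (moderately long) computation
        coordinator.submit_result(result)   # a short result message

coord = Coordinator([(i, i + 1000) for i in range(0, 10000, 1000)])
worker(coord)
print(sum(coord.results))  # 49995000, i.e. sum(range(10000))
```

Note that the task description (two integers) and the result (one integer) are both tiny compared to the work performed, which is exactly the property the scenario above asks for.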
Which of the exponential-algorithm paradigms we've seen in this course (e.g., generate-and-test, backtracking, dynamic programming, random walk, and quantum computing) would be suitable for use in such a distributed system? For each paradigm, explain how it could be partitioned into tasks, or why partitioning it would be difficult.
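To make "partitioned into tasks" concrete, consider generate-and-test as one example: an n-bit candidate space splits naturally into disjoint ranges, each of which is an independent task. The sketch below uses a toy predicate chosen purely for illustration; it is not meant to answer the question for the other paradigms.

```python
def make_tasks(n_bits, n_tasks):
    """Split the 2**n_bits candidate space into n_tasks disjoint ranges."""
    total = 2 ** n_bits
    chunk = total // n_tasks
    return [(i, min(i + chunk, total)) for i in range(0, total, chunk)]

def run_task(task, test):
    """Test every candidate in the range; return the satisfying ones."""
    lo, hi = task
    return [x for x in range(lo, hi) if test(x)]

# Example: find 16-bit values whose square ends in 4096 (a toy predicate).
tasks = make_tasks(n_bits=16, n_tasks=8)
hits = [h for t in tasks for h in run_task(t, lambda x: x * x % 10000 == 4096)]
print(hits[:5])
```

Because no task depends on another's output, the ranges can be handed out by a coordinator in any order, which is what makes this paradigm a natural fit for the distributed model described above.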