
comp.lang.python

running a program on many processors

Pawel Banys

3/8/2010 12:18:00 AM

Hello,

I have already read about Python and multiprocessing which allows using
many processors. The idea is to split a program into separate tasks and
run each of them on a separate processor. However I want to run a Python
program doing a single simple task on many processors so that their
cumulative power is available to the program as if there was one huge
CPU instead of many separate ones. Is it possible? How can it be achieved?

Best regards,

Paweł
6 Answers

Diez B. Roggisch

3/8/2010 12:29:00 AM


On 08.03.10 01:18, Paweł Banyś wrote:
> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be achieved?


That's impossible to answer without knowing anything about your actual
task. Not everything is parallelizable, and some algorithms suffer
penalties if parallelization is overdone.

So in essence, what you've read already covers it: if your "simple task"
is divisible into several independent sub-tasks that don't need
serialization, multiprocessing is your friend.
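[A minimal sketch of the split Diez describes, using the stdlib multiprocessing module. The task (summing squares) and the chunking scheme are illustrative choices, not from the thread:]

```python
from multiprocessing import Pool

def sum_squares(chunk):
    # Each worker process handles one independent sub-task.
    return sum(n * n for n in chunk)

def parallel_sum_squares(n_items=1_000_000, workers=4):
    # Split the range into `workers` independent, interleaved slices,
    # sum each in a separate process, then combine the partial results.
    chunks = [range(i, n_items, workers) for i in range(workers)]
    with Pool(processes=workers) as pool:
        return sum(pool.map(sum_squares, chunks))

if __name__ == "__main__":
    print(parallel_sum_squares())
```

[Each sub-task here is fully independent, which is exactly the precondition Diez names; if the chunks needed to coordinate with each other, the picture would change.]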

Diez

Gib Bogle

3/8/2010 12:49:00 AM


Paweł Banyś wrote:
...
> How can it be achieved?

Very carefully.

Steven D'Aprano

3/8/2010 1:09:00 AM


On Mon, 08 Mar 2010 01:18:13 +0100, Paweł Banyś wrote:

> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be
> achieved?

Try Parallel Python.

http://www.parallelp...

I haven't used it, but it looks interesting.

However, the obligatory warning against premature optimization: any sort
of parallel execution (including even lightweight threads) is hard to
build and much harder to debug. You should make sure that the potential
performance benefits are worth the pain before you embark on the job: are
you sure that the naive, single process version isn't fast enough?
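[Steven's "measure first" advice can be followed with nothing but the stdlib. A quick sketch (the task being timed is a placeholder):]

```python
import timeit

def naive_task():
    # Stand-in for the naive, single-process version of your task.
    return sum(n * n for n in range(100_000))

# Time 10 runs of the naive version before deciding whether
# parallelization is worth the extra complexity.
elapsed = timeit.timeit(naive_task, number=10)
print(f"10 runs took {elapsed:.3f}s")
```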


--
Steven

Martin P. Hellwig

3/8/2010 2:09:00 AM


On 03/08/10 00:18, Paweł Banyś wrote:
> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be achieved?
>
> Best regards,
>
> Paweł

As far as I know, the Python VM (cpython) will not analyze your code and
automatically spread parts over different processing units.

I did read, two years or so ago, that AMD was looking into something
that does just what you describe at the CPU level, that is, presenting
itself as one logical CPU while underneath there are multiple physical
ones. I wouldn't hold my breath waiting for it, though.

--
mph

Stefan Behnel

3/8/2010 7:43:00 AM


Martin P. Hellwig, 08.03.2010 03:08:
> I did read, two years or so ago, that AMD was looking in to something
> that does just what you say on a cpu level, that is present itself as
> one logical cpu but underneath there are multiple physical ones. I
> wouldn't hold my breath though waiting for it.

Many (desktop/server) CPUs actually do the opposite today - they present
themselves as one physical CPU per core with more than one (commonly two)
logical CPUs. This was introduced because modern CPUs have so many
independent parts (integer arithmetic, floating point, SSE, memory access)
that it's hard to keep all of them busy with a single process (which
usually does either integer arithmetic *or* floating point, for example,
rarely both in parallel). With multiple processes running on the same core,
it becomes a lot easier to find independent operations that can be sent to
different parts of the core in parallel.

Automatically splitting single-threaded code over multiple cores is
something that compilers (that see the full source code) should be able to
do a lot better than hardware (which only sees a couple of basic operations
at a time).

http://en.wikipedia.org/wiki/Vectorization_%28computer_...

Expecting this to work for an interpreted Python program is somewhat
unrealistic, IMHO. If you need data parallel execution, use something like
map-reduce or Copperhead instead of relying on the CPU to figure out what's
happening inside of a virtual machine.

http://fperez.org/py4science/ucb/talks/20091118_copperhead_bcat...
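[The map-reduce pattern Stefan mentions can be sketched with the stdlib alone (not Copperhead); the word-count task and helper names here are illustrative:]

```python
from collections import Counter
from functools import reduce
from multiprocessing import Pool

def count_words(text):
    # Map phase: each worker counts words in its own document.
    return Counter(text.split())

def merge(a, b):
    # Reduce phase: combine two partial counts into one.
    a.update(b)
    return a

def word_count(texts, workers=2):
    # Map over the documents in parallel, then reduce the partials.
    with Pool(processes=workers) as pool:
        partials = pool.map(count_words, texts)
    return reduce(merge, partials, Counter())

if __name__ == "__main__":
    docs = ["the cat sat", "the dog sat", "the cat ran"]
    print(word_count(docs))
```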

Stefan

Processor-Dev1l

3/8/2010 9:27:00 AM


On Mar 8, 1:18 am, Pawel Banys <moc.li...@synabp.reverse_the_string>
wrote:
> Hello,
>
> I have already read about Python and multiprocessing which allows using
> many processors. The idea is to split a program into separate tasks and
> run each of them on a separate processor. However I want to run a Python
> program doing a single simple task on many processors so that their
> cumulative power is available to the program as if there was one huge
> CPU instead of many separate ones. Is it possible? How can it be achieved?
>
> Best regards,
>
> Pawel

I can suggest trying a .NET language (like C#, Boo <-- Python-like, or
maybe even IronPython). The reason is that .NET languages analyze the
code and split it into logical parts that run across threads.