Re: [ptp-user] Problem with running MPI application in PTP

Mario,

Did you recompile the program with the new version of Open MPI? Make sure you're picking up the correct mpicc command.

Regards,
Greg

On Mar 10, 2010, at 4:41 AM, Mario Ogrizek wrote:

OK, I did all the changes, but still to no avail.

I ran the program with four processes, and this is the console output:

[Mario-Ogrizek-Tomass-MacBook-Pro.local:95041]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 1
[Mario-Ogrizek-Tomass-MacBook-Pro.local:95043]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
[Mario-Ogrizek-Tomass-MacBook-Pro.local:95042]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 1
[Mario-Ogrizek-Tomass-MacBook-Pro.local:95044]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 1
Hello MPI World From process 0: Num processes: 1


Also, I tried running from the shell, and there is a difference (it's the same type of program, but with different output).
With mpirun 1.4.1 I get this (which seems to be wrong):

Mario$ mpirun -np 4 MPI_test 
Hello, world, I am 0 of 1 
Hello, world, I am 0 of 1 
Hello, world, I am 0 of 1 
Hello, world, I am 0 of 1



while with the older mpirun I get:

Mario$ mpirun -np 4 MPI_test 
Hello, world, I am 0 of 4
Hello, world, I am 1 of 4
Hello, world, I am 2 of 4
Hello, world, I am 3 of 4

Now I changed the path of the resource manager to /usr/bin,
and I got correct output:

[Mario-Ogrizek-Tomass-MacBook-Pro.local:95346]  Map for job: 1 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 4 Num app_contexts: 1
  Data for app_context: index 0 app: /Users/Mario/Documents/workspaceCPP/MyMPIproject/Debug/MyMPIproject
  Num procs: 4
  Argv[0]: /Users/Mario/Documents/workspaceCPP/MyMPIproject/Debug/MyMPIproject
  Env[0]: OMPI_MCA_rmaps_base_display_map=1
  Env[1]: OMPI_MCA_orte_precondition_transports=a298d1a8380875f5-ad12d348a3cd0acd
  Env[2]: OMPI_MCA_rds=proxy
  Env[3]: OMPI_MCA_ras=proxy
  Env[4]: OMPI_MCA_rmaps=proxy
  Env[5]: OMPI_MCA_pls=proxy
  Env[6]: OMPI_MCA_rmgr=proxy
  Working dir: /Users/Mario/Documents/workspaceCPP/MyMPIproject/Debug (user: 0)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: True Num elements in procs list: 4
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,1,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0

  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,1,1]
  Proc Rank: 1 Proc PID: 0 App_context index: 0

  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,1,2]
  Proc Rank: 2 Proc PID: 0 App_context index: 0

  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,1,3]
  Proc Rank: 3 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 4
Hello MPI World from process 1!
Hello MPI World from process 2!
Hello MPI World from process 3!

I'm not sure if this red part is OK.

So now I'm confused; the older one seems to be correct.



On Wed, Mar 10, 2010 at 2:08 AM, Greg Watson <g.watson@xxxxxxxxxxxx> wrote:
Yes, I think 1.2.8 is the version shipped with Snow Leopard.

By default it will install in /usr/local, which is probably OK. You just need to make sure that /usr/local/bin is in your PATH ahead of /usr/bin by editing your .bash_profile. This will allow you to use the new Open MPI when you build/run from the command line.
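As a quick illustration of why the PATH order matters, here is a sketch that simulates an old and a new mpicc install with stub scripts (the stub directories and version strings are made up; the real commands live in /usr/bin and /usr/local/bin) and shows that the first matching directory in PATH wins:

```shell
#!/bin/sh
# Simulate two installs of mpicc with stub scripts (hypothetical versions).
tmp=$(mktemp -d)
mkdir -p "$tmp/usr_local_bin" "$tmp/usr_bin"
printf '#!/bin/sh\necho "mpicc (Open MPI) 1.4.1"\n' > "$tmp/usr_local_bin/mpicc"
printf '#!/bin/sh\necho "mpicc (Open MPI) 1.2.8"\n' > "$tmp/usr_bin/mpicc"
chmod +x "$tmp/usr_local_bin/mpicc" "$tmp/usr_bin/mpicc"

# New install first in PATH: the 1.4.1 stub is picked up.
PATH="$tmp/usr_local_bin:$tmp/usr_bin" mpicc

# Old install first in PATH: the stale 1.2.8 stub is picked up instead.
PATH="$tmp/usr_bin:$tmp/usr_local_bin" mpicc

rm -rf "$tmp"
```

The same lookup rule applies to mpirun, which is why a stale /usr/bin entry ahead of /usr/local/bin silently mixes versions.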

For PTP, you need to do either of the following:

1) On the resource manager Open MPI tool configuration wizard page, uncheck "Use default location" and enter /usr/local/bin into the "Location" field. You will also need to modify your Makefile so that it picks up mpicc from /usr/local/bin rather than /usr/bin; or

2) Create a directory called .MacOSX (case is important) in your home directory. Create a file in this directory called "environment.plist" with the following contents:

<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
        <key>PATH</key>
        <string>/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin</string>
</dict>
</plist>
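For reference, option 2 can be done from a terminal roughly like this (a sketch; the PATH value mirrors the plist above, with duplicate entries removed):

```shell
#!/bin/sh
# Sketch: create ~/.MacOSX/environment.plist from the shell.
mkdir -p "$HOME/.MacOSX"
cat > "$HOME/.MacOSX/environment.plist" <<'EOF'
<?xml version="1.0" encoding="UTF-8"?>
<!DOCTYPE plist PUBLIC "-//Apple Computer//DTD PLIST 1.0//EN" "http://www.apple.com/DTDs/PropertyList-1.0.dtd">
<plist version="1.0">
<dict>
        <key>PATH</key>
        <string>/usr/local/bin:/usr/bin:/bin:/usr/sbin:/sbin</string>
</dict>
</plist>
EOF
```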

Restart Eclipse and it should now pick up everything from the correct path.

Regards,
Greg


On Mar 9, 2010, at 7:49 PM, Mario Ogrizek wrote:

I did download version 1.4.1 and installed it.
Is it possible that my current version is the default Open MPI that shipped with OS X?

I'm not particularly skilled with "shell work" and am not sure how to switch/upgrade/remove the old version. :(

On Wed, Mar 10, 2010 at 1:36 AM, Greg Watson <g.watson@xxxxxxxxxxxx> wrote:
Mario,

I presume you're using PTP 3.0.1. 

Open MPI 1.2.8 is a very old version. It's possible that something is wrong with parsing the output (your program may actually be running correctly). Would it be possible to update to the latest version, Open MPI 1.4.1?

Greg

On Mar 9, 2010, at 7:10 PM, Mario Ogrizek wrote:

Hi,
I've done some extensive research on my problem and couldn't find a solution.


The problem is:
I'm trying to run a simple MPI hello-world app.
In the console, it works great:
I just type "mpirun -np 4 MPiHelloWorld" and all is OK.

But in Eclipse, after I set up the whole PTP environment
and start the program with 2 processes, I get red output in the console:

[Mario-Ogrizek-Tomass-MacBook-Pro.local:94363]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 1
[Mario-Ogrizek-Tomass-MacBook-Pro.local:94362]  Map for job: 0 Generated by mapping mode: byslot
  Starting vpid: 0 Vpid range: 1 Num app_contexts: 1
  Data for app_context: index 0 app: unknown
  Num procs: 1
  Working dir: (null) (user: 108)
  Num maps: 0
  Num elements in nodes list: 1
  Mapped node:
  Cell: 0 Nodename: Mario-Ogrizek-Tomass-MacBook-Pro.local Launch id: -1 Username: NULL
  Daemon name:
Data type: ORTE_PROCESS_NAME Data Value: NULL
  Oversubscribed: False Num elements in procs list: 1
  Mapped proc:
  Proc Name:
  Data type: ORTE_PROCESS_NAME Data Value: [0,0,0]
  Proc Rank: 0 Proc PID: 0 App_context index: 0
Hello MPI World From process 0: Num processes: 1


And in the process info I get "Job 0:0" and "Job 0:1", which both have the same program output: "Hello MPI World From process 0: Num processes: 1"


There seems to be some kind of problem with the ORTE process; that's my guess.

An important thing to mention: I'm running this on OS X 10.6 Snow Leopard.
I first installed MPICH, and after that Open MPI, which I decided to use.

Also, the mpirun version is: mpirun (Open MPI) 1.2.8


Please help me!
_______________________________________________
ptp-user mailing list
ptp-user@xxxxxxxxxxx
https://dev.eclipse.org/mailman/listinfo/ptp-user

