[ptp-user] Output from unsuccessful attempts to run TAU from Eclipse

Dear Sir,

I'm replying to your questions:

In the build/console output for your project, can you tell if your application is successfully built with TAU? 
 
As far as I can tell, yes, it was. I got this output:

**** Build of configuration Debug__tau-t1-mpi-pdt-profile-trace for project demo1 ****

make all
Building file: ../src/demo1.c
Invoking: GCC C Compiler
/mirror/mpiuser/t/i386_linux/bin/tau_cc.sh  -tau_options='-optPdtDir="/mirror/mpiuser/pdtoolkit3.16/i386_linux" -optMpi  -optPDTInst  ' -tau_makefile=/mirror/mpiuser/t/i386_linux/lib/Makefile.tau-t1-mpi-pdt-profile-trace -I/mirror/mpiuser/mpich2/include -I/mirror/mpiuser/mpich2/include -O0 -g3 -Wall -c -fmessage-length=0 -MMD -MP -MF"src/demo1.d" -MT"src/demo1.d" -o"src/demo1.o" "../src/demo1.c"

Finished building: ../src/demo1.c
 
Building target: demo1
Invoking: GCC C Linker
/mirror/mpiuser/t/i386_linux/bin/tau_cc.sh  -tau_options='-optPdtDir="/mirror/mpiuser/pdtoolkit3.16/i386_linux" -optMpi  -optPDTInst  ' -tau_makefile=/mirror/mpiuser/t/i386_linux/lib/Makefile.tau-t1-mpi-pdt-profile-trace -L/mirror/mpiuser/mpich2/lib -L/mirror/mpiuser/mpich2/lib -o"demo1"  ./src/demo1.o   -lmpich -lmpich

Finished building target: demo1

Which tau makefile/configuration did you select to build with?

In the 'Profile Configurations' dialog, under the 'TAU' tab, 'Analysis options' sub-tab, 'Select Makefile' combo, I chose the option "Makefile.tau-t1-mpi-pdt-profile-trace".

I still get a "No profile data was located ..." message box, and then a "Could not generate ppk file from profile data" message, which sounds reasonable given the first. My console also gives me the following output:

invalid "local" arg: -xusage:mpiexec [-h or -help or --help]    # get this messagempiexec -file filename             # (or -f) filename contains XML job descriptionmpiexec [global args] [local args] executable [args]   where global args may be      -l                           # line labels by MPI rank      -bnr                         # MPICH1 compatibility mode      -machinefile                 # file mapping procs to machines      -s <spec>                    # direct stdin to "all" or 1,2 or 2-4,6       -1                           # override default of trying 1st proc locally      -ifhn                        # network interface to use locally      -tv                          # run procs under totalview (must be installed)      -tvsu                        # totalview startup only      -gdb                         # run procs under gdb      -m                           # merge output lines (default with gdb)      -a                           # means assign this alias to the job      -ecfn                        # output_xml_exit_codes_filename      -recvtimeout <integer_val>   # timeout for recvs to fail (e.g. 
from mpd daemon)      -g<local arg name>           # global version of local arg (below)    and local args may be      -n <n> or -np <n>            # number of processes to start      -wdir <dirname>              # working directory to start in      -umask <umask>               # umask for remote process      -path <dirname>              # place to look for executables      -host <hostname>             # host to start on      -soft <spec>                 # modifier of -n value      -arch <arch>                 # arch type to start on (not implemented)      -envall                      # pass all env vars in current environment      -envnone                     # pass no env vars      -envlist <list of env var names> # pass current values of these vars      -env <name> <value>          # pass this value of this env varmpiexec [global args] [local args] executable args : [local args] executable...mpiexec -gdba jobid                # gdb-attach to existing jobidmpiexec -configfile filename       # filename contains cmd line segs as lines  (See User Guide for more details)Examples:   mpiexec -l -n 10 cpi 100   mpiexec -genv QPL_LICENSE 4705 -n 3 a.out   mpiexec -n 1 -host foo master : -n 4 -host mysmp slave
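So mpiexec rejected a "-x" flag and the job apparently never ran, which would explain why no profile data was found. One thing I could check by hand (a sketch, not something from the Eclipse setup itself — the TAU_RUN_DIR variable is just my placeholder for the run's working directory) is whether the run left any of TAU's profile.<node>.<context>.<thread> files behind, since that is what the "No profile data was located" step is looking for:

```shell
# Look for TAU's per-rank profile files in the run's working directory.
# TAU_RUN_DIR is a placeholder; it defaults to the current directory here.
rundir="${TAU_RUN_DIR:-.}"
count=$(ls "$rundir"/profile.* 2>/dev/null | wc -l)
if [ "$count" -eq 0 ]; then
  echo "no TAU profile files found in $rundir"
else
  echo "found $count TAU profile file(s) in $rundir"
fi
```

If this reports no files even after launching the instrumented demo1 manually with mpiexec, the problem is in the launch, not in the Eclipse/ppk step.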

Thank you in advance for your support.

Respectfully,
Constantinos
