[ptp-user] SDM: Parallel MPI debugging crash with Fortran derived types

Hello all,

I'm using Eclipse Luna for Parallel Application Developers and trying to debug a parallel Fortran application with it. I can compile and run the application without any problem, but when I try to debug with SDM I always get the following error:

*** Error in `/home/vsande/.eclipsesettings/sdm': double free or corruption (out): 0x0000000001543970 ***
--------------------------------------------------------------------------
mpirun noticed that process rank 1 with PID 18795 on node LSSC-T1700 exited on signal 6 (Aborted).
--------------------------------------------------------------------------

I'm running eclipse on a x86_64 Linux PC with Ubuntu 14.04:
Gfortran: gcc version 4.9.2 (Ubuntu 4.9.2-0ubuntu1~14.04)
MPI: (Open MPI) 1.8.4
GNU gdb (GDB) 7.8.2
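
For completeness, this is roughly how I build and launch the example from the command line (a sketch of my setup; the exact flags PTP generates may differ):

```shell
# Build with debug info and no optimization so gdb/SDM sees clean symbols
mpif90 -g -O0 -o CalculatePi CalculatePi.f90

# Run with 2 ranks; this works fine outside the debugger
mpirun -np 2 ./CalculatePi
```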

My eclipse configuration:
/home/vsande/eclipse/jre/bin/java
-Dosgi.requiredJavaVersion=1.7
-XX:MaxPermSize=256m
-Xms512m
-Xmx2048m
-jar /home/vsande/eclipse//plugins/org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
-os linux
-ws gtk
-arch x86_64
-showsplash /home/vsande/eclipse//plugins/org.eclipse.platform_4.4.0.v20140925-0400/splash.bmp
-launcher /home/vsande/eclipse/eclipse
-name Eclipse
--launcher.library /home/vsande/eclipse//plugins/org.eclipse.equinox.launcher.gtk.linux.x86_64_1.1.200.v20140603-1326/eclipse_1605.so
-startup /home/vsande/eclipse//plugins/org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar
--launcher.appendVmargs
-exitdata 564800d
-product org.eclipse.epp.package.parallel.product
-vm /home/vsande/eclipse/jre/bin/java
-vmargs
-Dosgi.requiredJavaVersion=1.7
-XX:MaxPermSize=256m
-Xms512m
-Xmx2048m
-jar /home/vsande/eclipse//plugins/org.eclipse.equinox.launcher_1.3.0.v20140415-2008.jar

It seems that there is some problem with Fortran derived types. To narrow the problem down, I tried to reproduce it in a simpler context: I made some minor changes to the Calculate Pi example, adding a derived-type variable to the argument list of the calc_pi subroutine. I get the same problem!

Does anyone know about this bug? Am I doing something wrong?

Below I paste my Calculate Pi source code (my changes are the typedef module, the dummyarg argument to calc_pi, and the lines that use it):

! ============================================================================
! Name        : CalculatePi.f90
! Author      :
! Version     :
! Copyright   : Your copyright notice
! Description : Calculate Pi in MPI
! ============================================================================

module typedef
    type :: ddt
        integer :: num
    end type
end module


subroutine calc_pi(rank, num_procs, dummyarg)
    use mpi
    use typedef
    implicit none

    integer, intent(in) :: rank
    integer, intent(in) :: num_procs

    integer          :: i
    integer          :: ierror
    integer          :: num_intervals
    double precision :: h
    double precision :: mypi
    double precision :: pi
    double precision :: sum
    double precision :: x
   
    type(ddt), intent(inout) :: dummyarg
    print*, 'Dummyarg', num_procs, rank, dummyarg%num

    ! set number of intervals to calculate
    if (rank == 0) num_intervals = 100000000

    ! tell other tasks how many intervals
    call MPI_Bcast(num_intervals, 1, MPI_INTEGER, 0, MPI_COMM_WORLD, ierror)

    ! now everyone does their calculation

    h = 1.0d0 / num_intervals
    sum = 0.0d0

    do i = rank + 1, num_intervals, num_procs
        x = h * (i - 0.5d0)
        sum = sum + (4.0d0 / (1.0d0 + x*x))
    end do

    mypi = h * sum

    ! combine everyone's calculations
    call MPI_Reduce(mypi, pi, 1, MPI_DOUBLE_PRECISION, MPI_SUM, 0, &
        MPI_COMM_WORLD, ierror)

    if (rank == 0) print *, "PI is approximately ", pi
end subroutine

program mpi_pi_example
    use mpi
    use typedef
    implicit none

    integer, parameter :: LEN = 100               ! message length

    integer            :: ierror                  ! error code
    integer            :: my_rank                 ! rank of process
    integer            :: num_procs               ! number of processes
    integer            :: source                  ! rank of sender
    integer            :: dest                    ! rank of receiver
    integer            :: tag                     ! tag for messages
    character(len=LEN) :: message                 ! storage for message
    integer            :: status(MPI_STATUS_SIZE) ! return status for receive

    type(ddt) :: dummyarg

    dest = 0
    tag = 0

    ! start up MPI 
    call MPI_Init(ierror)

    ! find out process rank
    call MPI_Comm_rank(MPI_COMM_WORLD, my_rank, ierror)
    dummyarg%num = my_rank
    ! find out number of processes
    call MPI_Comm_size(MPI_COMM_WORLD, num_procs, ierror)


    if (my_rank .ne. 0) then
        ! create message
        write (message, *) "Greetings from process ", my_rank
        call MPI_Send(message, LEN, MPI_CHARACTER, &
                dest, tag, MPI_COMM_WORLD, ierror)
    else
        print *, "Num processes: ", num_procs
        do source = 1, num_procs-1
            call MPI_Recv(message, LEN, MPI_CHARACTER, source, tag, &
                    MPI_COMM_WORLD, status, ierror)
            print *, "Process 0 received ", message
        end do

        ! now return the compliment
        write (message, *) "Hi, how are you?"
    end if

    call MPI_Bcast(message, LEN, MPI_CHARACTER, dest, MPI_COMM_WORLD, ierror)

    if (my_rank .ne. 0) then
        print *, "Process ", my_rank, " received ", message
    end if

    ! calculate PI
    call calc_pi(my_rank, num_procs, dummyarg)

    ! shut down MPI
    call MPI_Finalize(ierror)

    stop
end program
 

Any help is appreciated.

Best regards,
Víctor.
