All topics - Topic: mpif90
k******y
Posts: 1407
1
I'm trying to compile a large MPI program. I installed OpenMPI earlier, and small MPI test programs compile fine.
But now this compilation fails: mpif90 -O3 xxx.f
[lots of output omitted]
ZnSimA4.f(434): (col. 8) remark: LOOP WAS VECTORIZED.
ZnSimA4.f(436): (col. 8) remark: LOOP WAS VECTORIZED.
ZnSimA4.f(437): (col. 8) remark: LOOP WAS VECTORIZED.
ZnSimA4.f(442): (col. 14) remark: LOOP WAS VECTORIZED.
ZnSimA4.f(489): (col. 14) remark: LOOP WAS VECTORIZED.
ld: in /opt/intel/fce/10.1.017/lib/libimf.a(cbrt_gen.o),
ObjectFileAddressSpace::mappedAddress(0xFFFFFFFFFFFFFFFC) not in any section
l***a
Posts: 149
2
This is my first time posting here for help; if this doesn't belong on this board, please delete it.
I'm running a numerical model written in Fortran on an Apple Pro machine with OS X 10.6, and it needs the MPI library. I noticed that there were already openmpi directories under /usr/lib and /usr/include, but on an impulse I went ahead and installed another copy, openmpi-1.4.3, myself.
Fortran, gcc, and g++ all come from the GCC package.
The problem now is that every program compiled with mpif90 hits this error when it runs:
mca: base: component_find: unable to open /usr/lib/openmpi/mca_ras_dash_host (ignored)
mca: base: component_find: unable to open /usr/lib/openmpi/mca_ras_gridengine: perhaps a missing symbol, or compiled for a different version of Open MPI? (ignored)
mca: base: compon... [remainder of post truncated]
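
These errors usually mean the runtime is loading components from a different OpenMPI installation than the one mpif90 came from. A quick way to check which copies are being picked up is sketched below; the /usr/local prefix for the self-built 1.4.3 is only an assumption.

# which wrapper and launcher are found first on PATH?
which mpif90 mpirun
mpirun --version        # should report 1.4.3 if the new install is in use

# if the self-built copy lives under /usr/local (assumption), put it first
export PATH=/usr/local/bin:$PATH
export DYLD_LIBRARY_PATH=/usr/local/lib:$DYLD_LIBRARY_PATH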
s*****l
Posts: 167
3
From topic: Computation board - Re: f90 and mpi
sure... mpif90 or mpif95
S***y
Posts: 186
4
From topic: Computation board - Re: f90 and mpi
mpif77, mpif90, ... are just so-called "wrapper" compilers.
They are not really compilers themselves; they invoke an underlying Fortran compiler to carry out the actual compilation.
So, when MPI is installed, if mpif77 is linked to a Fortran 90 compiler such as pgf90 under Linux, even mpif77 can compile Fortran 90 programs that call MPI subroutines.
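
You can ask the wrapper what it actually runs. The --showme flag below is Open MPI's (MPICH uses -show instead), and the gfortran/ifort backends here are just examples:

mpif90 --showme
#   e.g. gfortran -I/usr/include/openmpi ... -lmpi_f90 -lmpi_f77 -lmpi
# Open MPI also lets you swap the backend compiler via an environment variable:
OMPI_FC=ifort mpif90 -O3 myprog.f90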
s*****l
Posts: 167
5
From topic: Computation board - mpi_gather
I have two MPI_GATHER calls in my program:
one transfers md > 1 real numbers from each processor,
the other carries only 1.
The program cannot pass compilation...
mpif90 complains that in the second call the 1st argument, viz. the number of real variables (which I set to 1), is inconsistent.
Has anyone seen this problem?
I had to change it to a set of send-recvs, which is a pain in the neck.
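
For illustration, here is a minimal sketch of the two calls as described; the names and sizes are made up. One possible cause of this kind of complaint is that the second call passes a scalar (or a buffer of different rank) where the first call passed an array, and the compiler checks the two calls against each other; declaring the single value as a length-1 array sidesteps that, but without seeing the code this is only a guess.

program gather_demo
  use mpi
  implicit none
  integer, parameter :: md = 4            ! hypothetical block size
  integer :: ierr, rank, nprocs
  real(8) :: a(md), x(1)                  ! x as a length-1 array, not a scalar
  real(8), allocatable :: abuf(:), xbuf(:)

  call MPI_INIT(ierr)
  call MPI_COMM_RANK(MPI_COMM_WORLD, rank, ierr)
  call MPI_COMM_SIZE(MPI_COMM_WORLD, nprocs, ierr)
  allocate(abuf(md*nprocs), xbuf(nprocs))
  a = real(rank, 8)
  x = real(rank, 8)

  ! first gather: md reals from every rank
  call MPI_GATHER(a, md, MPI_DOUBLE_PRECISION, abuf, md, &
                  MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)
  ! second gather: a single real from every rank
  call MPI_GATHER(x, 1, MPI_DOUBLE_PRECISION, xbuf, 1, &
                  MPI_DOUBLE_PRECISION, 0, MPI_COMM_WORLD, ierr)

  call MPI_FINALIZE(ierr)
end program gather_demo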
l******n
Posts: 9344
6
I am saying ifort is slow from my own experience, and of course it depends on my machine. For compiling the same program, pgf90 and mpif90 are much faster than ifort for me.
Mixing compilers is doable, but you need to figure out the settings and all the flags. Different compilers have different default settings, so reading the whole manual is a must. There may also be features in one compiler that have no counterpart in the others, which is very bad.

x********g
Posts: 47
7
From topic: Computation board - error when compiling with mpiifort
Could an expert please advise? The makefile is as follows:
# application name
APP = parallel_femsim
# list of source files
SRC = ran2.f90 globals.f90 model.f90 scattering_factors.f90 fem1.f90 parallel_femsim.f90
# list of object files
OBJ = ran2.o globals.o model.o scattering_factors.o fem1.o parallel_femsim.o
# define libraries needed by the linker
#LIBS = -lmkl
# compiler options for debugging
FC_DEBUG = mpif90 -g -debug -implicitnone
# compiler options for optimized running
#FC_OPT = ifort -O3 -xO -ipo -no-prec-div -static
#F
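
The makefile is cut off above. For reference only, a typical way a makefile like this is finished when linking with mpif90 is sketched below; the rules and the clean target are assumptions, not the original poster's file (recipe lines must start with a tab):

FC = $(FC_DEBUG)

$(APP): $(OBJ)
	$(FC) -o $(APP) $(OBJ) $(LIBS)

%.o: %.f90
	$(FC) -c $<

clean:
	rm -f $(APP) $(OBJ) *.mod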