Linking a Haskell executable against a static library written in C++ gives `undefined reference`

I created a static library:

    // foo.h
    extern "C" {
        int foo (const char* arg0, int arg1);
    }

    // foo.cpp
    #include "foo.h"
    // implementation of foo

This code was compiled into foo.o and archived into libfoo.a , which was installed in the MinGW lib directory (I am on Windows, using the GCC toolchain).
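The build and install steps were roughly the following (a sketch; the install path is illustrative):

    g++ -c foo.cpp -o foo.o
    ar rcs libfoo.a foo.o
    copy libfoo.a C:\MinGW\lib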

What I want to do is wrap this function in Haskell, so the FFI binding looks like this:

    -- Foo.hs
    {-# LANGUAGE ForeignFunctionInterface #-}
    module Foo where

    import Foreign.C.String (CString)
    import Foreign.C.Types (CInt)

    foreign import ccall "foo"
        c_foo :: CString -> CInt -> IO CInt
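The call site then just marshals the arguments; a minimal sketch (the wrapper name fooHs is mine, only for illustration):

    -- In Foo.hs, with withCString imported from Foreign.C.String.
    fooHs :: String -> Int -> IO Int
    fooHs s n = withCString s $ \cs ->
        fmap fromIntegral (c_foo cs (fromIntegral n))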

extra-libraries has been added to the .cabal file:

    ...
    extra-libraries: foo, stdc++
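For context, the relevant stanza of the .cabal file looks roughly like this (a sketch from memory; extra-lib-dirs is only needed if libfoo.a is outside the default search path):

    library
      exposed-modules:  Foo
      build-depends:    base
      extra-libraries:  foo, stdc++
      -- extra-lib-dirs: C:\MinGW\lib  (only if not on the default path)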

But GHC complains about an undefined reference to foo :

 .dist-scion\build\path\to\Foo.o:fake:(.text+0x514): undefined reference to `foo' 

After running nm on the library, I can see that the foo function does exist in it (with some decoration on the name), so I'm stuck here ...


[EDIT]
I also tried building a Haskell package using cabal:

    cabal configure
    cabal build

Result:

    Loading object (dynamic) foo ... ghc.exe: foo: ....
    <command line>: user specified .o/.so/.DLL could not be loaded (addDLL: could not load DLL)

So, is this related to static vs. dynamic linking? I noticed that GHC wants to load a .o/.so/.DLL , but NOT a .a . I'm really confused.


Finally, I found something on the wiki: Cxx foreign function interface


[EDIT]
One solution is to use -optl-lfoo -optl-lstdc++ in the .cabal file rather than extra-libraries . The naming problem can easily be solved by wrapping the declaration in extern "C" :

    #ifdef __cplusplus
    extern "C" {
    #endif

    extern int foo (const char*, int);

    #ifdef __cplusplus
    }
    #endif
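The linker flags presumably go in via ghc-options, roughly like this (a sketch; the exact field placement may vary):

    ghc-options: -optl-lfoo -optl-lstdc++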

This works in EclipseFP because it uses Scion, but it still does not work with cabal build .
