Spent the last couple of days getting a libhdfs-dependent project working on Solaris. Essentially, this package, which is part of Hadoop core, lets you interact with HDFS from your C programs using Java JNI.

These instructions are for building 64-bit libhdfs libraries on Solaris AMD64.

Step 1

Get a copy of Hadoop on your Solaris system.

Odds are you will be behind a firewall. I suggest you procure a Linux system outside the firewall, install the Squid proxy on it, and then ssh from the Linux system to your Solaris system, forwarding port 3128. Then set up the proxy variables on the Solaris system. You will need external HTTP access so that Ivy can pull down Hadoop's dependencies.
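
A remote forward along these lines should do it (the user and host names are placeholders, and it assumes Squid is listening on its default port 3128 on the Linux box):

# run from the Linux system so the Solaris box can reach Squid via localhost:3128
ssh -R 3128:localhost:3128 youruser@solaris-host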

export http_proxy=127.0.0.1:3128
export ANT_OPTS="-Dhttp.proxyHost=127.0.0.1 -Dhttp.proxyPort=3128"

You will also need a few other GNU packages to move forward. I used m4-1.4.15, automake-1.11, and autoconf-2.68. For each of these I downloaded the source, unpacked it, ran ./configure --prefix=$HOME, and then make install, with ~/bin in my PATH. You also need a 64-bit JDK (with the include and jre/lib/amd64/server directories) and JAVA_HOME set accordingly. Oh yeah, you need Ant too.
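
The build-and-install routine for those GNU packages is the usual one; roughly, for each package (m4 shown here, repeat for automake and autoconf):

gzip -dc m4-1.4.15.tar.gz | tar xf -
cd m4-1.4.15
./configure --prefix=$HOME
make && make install
export PATH=$HOME/bin:$PATH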

Step 2

Here we will start a build that will eventually fail.

Unpack your Hadoop distribution and move into its base directory.

ant -Divy.checksums="" compile-c++-libhdfs -Dlibhdfs=1 -Dcompile.c++=1 -Dneed.libhdfs.configure=1

Expect to see some Ivy output, then configure starting, and finally a build failure.

Step 3

Tweak the configuration to build.

cd src/c++/libhdfs
export JVM_ARCH=64
CC=cc ./configure

Next you need to modify the Makefile.

Change 1: find "-Wl,-x" and replace it with "-Wl -x".
Change 2: delete all of the -L entries for JVM lib directories except the one containing amd64/server (see the sketch below).
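
If you would rather script the first change, something like this should work (an assumption on my part; I made both edits by hand, and Change 2 still needs a manual pass over the link flags):

perl -pi -e 's/-Wl,-x/-Wl -x/g' Makefile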

Lastly, you are ready to build:

make

Step 4

You will find the libraries here:

cd src/c++/libhdfs/.libs
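
To confirm they really are 64-bit objects, file should report ELF 64-bit (the exact .so names depend on libtool's versioning, so adjust as needed):

file libhdfs.so*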

Step 5

To use the library you built, you will need:

  • The headers in src/c++/libhdfs
  • The library you built: -L<path to .libs> and -lhdfs
  • The JVM library: -L<JAVA_HOME>/jre/lib/amd64/server and -ljvm
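
To sanity-check the whole toolchain, here is a minimal sketch of a libhdfs client. The NameNode address, file path, and <hadoop> source directory are placeholders; hdfsConnect("default", 0) just picks up whatever HDFS configuration is on the CLASSPATH at run time.

#include "hdfs.h"

#include <fcntl.h>
#include <stdio.h>
#include <string.h>

int main(void) {
    /* "default", 0 means: use the default filesystem from the Hadoop config on the CLASSPATH */
    hdfsFS fs = hdfsConnect("default", 0);
    if (fs == NULL) {
        fprintf(stderr, "failed to connect to HDFS\n");
        return 1;
    }

    /* placeholder path: write a small test file */
    const char *path = "/tmp/libhdfs_test.txt";
    hdfsFile out = hdfsOpenFile(fs, path, O_WRONLY | O_CREAT, 0, 0, 0);
    if (out == NULL) {
        fprintf(stderr, "failed to open %s for writing\n", path);
        hdfsDisconnect(fs);
        return 1;
    }

    const char *msg = "hello from Solaris\n";
    tSize written = hdfsWrite(fs, out, (void *)msg, (tSize)strlen(msg));
    hdfsFlush(fs, out);
    hdfsCloseFile(fs, out);
    hdfsDisconnect(fs);

    printf("wrote %d bytes to %s\n", (int)written, path);
    return 0;
}

A compile line along the lines of the list above (paths are placeholders):

cc hdfs_test.c -I<hadoop>/src/c++/libhdfs -L<hadoop>/src/c++/libhdfs/.libs -lhdfs -L<JAVA_HOME>/jre/lib/amd64/server -ljvm -o hdfs_test

At run time you will also need LD_LIBRARY_PATH to include both of those library directories and CLASSPATH to include the Hadoop jars, otherwise the embedded JVM cannot find the HDFS classes.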