Grails on Openshift

I’ve been playing with Openshift for a week or two now, and I’ve really liked what I’ve seen.  SSH access, git repo usage, cartridges, built-in Jenkins capabilities: there are a lot of great things there.  What I haven’t seen, however, is native grails support.

In short, I wanted to build my (private) github-based project every time I pushed to the Openshift repo.  I started with https://github.com/jasonxrowland/openshift-grails-quickstart as a quickstart, but soon began modifying things.

New Environment Variables

The quickstart made heavy use of the OPENSHIFT_RUNTIME_DIR and OPENSHIFT_LOG_DIR environment variables, which unfortunately no longer exist.  These are now based on the cartridge name, so I just exported my own versions of the variables in each hook script where they were used (deploy, stop, start, etc.):

export OPENSHIFT_RUNTIME_DIR=$OPENSHIFT_HOMEDIR/diy-0.1/runtime
export OPENSHIFT_LOG_DIR=$OPENSHIFT_DIY_LOG_DIR
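Since every hook needs the same exports, one option is to keep the mapping in a single shared file and source it from each hook. This is just a sketch: the fallback values only exist so the snippet runs outside a gear, where the cartridge variables are set for you.

```shell
# Hypothetical shared snippet (e.g. saved next to the hooks and sourced
# from deploy/start/stop) so the variable mapping lives in one place.
# The := defaults are illustrative; on the gear these are already set.
: "${OPENSHIFT_HOMEDIR:=/tmp/gear}"
: "${OPENSHIFT_DIY_LOG_DIR:=/tmp/gear/logs}"
export OPENSHIFT_RUNTIME_DIR="$OPENSHIFT_HOMEDIR/diy-0.1/runtime"
export OPENSHIFT_LOG_DIR="$OPENSHIFT_DIY_LOG_DIR"
```

Sourcing one file also means a future cartridge rename only has to be fixed in one place.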

Cleaning Baggage and Using the Grails Wrapper

The quickstart project is HUGE, mostly because of the Grails installation and the sample application it includes.  I first looked at what I could trim from there.

I wanted to build my grails application from a github-based repo (the latest or a tagged version), so I removed the ‘app’ directory from the repo.  I also wanted to use the grails wrapper, so I removed the ‘diy/grails’ directory as well.  As it turns out, it’s quite easy to get the grails wrapper to play nicely on openshift; all it needed was some environment variables:

export JAVA_HOME=/etc/alternatives/java_sdk_1.7.0
export JAVA_OPTS="$JAVA_OPTS -Divy.default.ivy.user.dir=$OPENSHIFT_DATA_DIR -Duser.home=$OPENSHIFT_DATA_DIR"
export HOME=$OPENSHIFT_RUNTIME_DIR

Using a GitHub Repo

Perhaps the largest hurdle I needed to overcome was using a new deploy SSH key for github access, since my repo is private.  Openshift locks down its .ssh directory tight, so I needed a little GIT_SSH magic to get things working. This required my repo to carry the private key, a script for calling SSH with some custom flags, and some work in the Openshift deploy hook to tie everything together.  NOTE: I put the private key in the repo, but you could alternatively copy the private key to the server manually, into a known location.

I also wanted to dynamically clone or pull the repo from GitHub on deployment.  When all was said and done, I had a ‘.ssh’ folder in the ‘diy’ base folder of the repo containing my private key (id_rsa) and my SSH utility script (simply named ssh, which can be confusing).  The SSH script looked like the following:

#!/bin/bash
KNOWN_HOSTS_FILE="$OPENSHIFT_RUNTIME_DIR/.ssh/known_hosts"
ID_RSA_FILE="$OPENSHIFT_RUNTIME_DIR/.ssh/id_rsa"
ssh -i "$ID_RSA_FILE" -o UserKnownHostsFile="$KNOWN_HOSTS_FILE" "$@"

This script calls the ssh executable with my private key and a known_hosts file other than the default (exporting the HOME env variable does not change where SSH looks for it).
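To convince yourself the GIT_SSH mechanism works before wiring it into the deploy hook, you can prove that git routes its SSH transport through whatever GIT_SSH points at. The sketch below stands up a stub wrapper that records its arguments, then asks git to contact a bogus SSH remote; the record file shows the stub was invoked. All /tmp paths and the fake host are illustrative only.

```shell
# Create a stub GIT_SSH wrapper that logs how it was called, then refuses
# the connection; we only care that git invoked it.
DEMO=/tmp/gitssh-demo
mkdir -p "$DEMO"
cat > "$DEMO/ssh" <<'EOF'
#!/bin/bash
echo "wrapper invoked with: $@" > /tmp/gitssh-demo/called
exit 1
EOF
chmod +x "$DEMO/ssh"

# ls-remote works outside any repo and uses the SSH transport for ssh:// URLs.
GIT_SSH="$DEMO/ssh" git ls-remote ssh://git@example.invalid/repo.git 2>/dev/null || true
```

The same trick works with the real wrapper: point GIT_SSH at it and run `git ls-remote` against the private repo to test authentication without cloning.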

I then put the following code in my .openshift/action_hooks/deploy script (note that previously the environment variables were set as shown above, including HOME):

# Copy the ssh folder into the fake HOME directory and chmod the private key so it is acceptable to ssh
cp -r $OPENSHIFT_REPO_DIR/diy/.ssh $HOME/
chmod 0600 $HOME/.ssh/id_rsa

if [ -d $OPENSHIFT_RUNTIME_DIR/app ]
then
    cd $OPENSHIFT_RUNTIME_DIR/app
    # GIT_DIR must point at the .git directory itself, not the work tree
    GIT_SSH="$OPENSHIFT_RUNTIME_DIR/.ssh/ssh" GIT_DIR=$OPENSHIFT_RUNTIME_DIR/app/.git git pull
else
    # Checkout the app
    cd $OPENSHIFT_RUNTIME_DIR
    GIT_SSH="$OPENSHIFT_RUNTIME_DIR/.ssh/ssh" git clone REPO_LOCATION app
fi

Make sure the REPO_LOCATION string is replaced with the repo’s real SSH access string (of the form git@github.com:user/repo.git).  This script causes the repo to be pulled into the ~/diy-0.1/runtime/app folder (clone if it doesn’t exist, pull if it does) using the ssh script copied in the previous step.

WARNING: This did not work correctly until I pushed the repo, watched the build fail, ssh’d into the openshift server, set the environment variables as shown above, and then manually called the fake ssh script for git@github.com.  This allowed me to add the github server to the known_hosts file and thereafter the key auth worked just fine.
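A non-interactive alternative to that manual first connection is to seed the custom known_hosts file with GitHub’s host key during deploy using ssh-keyscan. Ideally verify the scanned key against GitHub’s published fingerprints before trusting it; the /tmp fallback below is only so the sketch runs outside a gear.

```shell
# Append GitHub's SSH host key to the wrapper's known_hosts file so the
# first git clone/pull does not stall on a host-verification prompt.
RUNTIME="${OPENSHIFT_RUNTIME_DIR:-/tmp/runtime}"
mkdir -p "$RUNTIME/.ssh"
ssh-keyscan github.com >> "$RUNTIME/.ssh/known_hosts" 2>/dev/null || true
```

Running this near the top of the deploy hook would have avoided the failed build and manual SSH session described above.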

Building and Deploying

Lastly, I tied all the previous pieces together and added the following code to the deploy hook script:

cd $OPENSHIFT_RUNTIME_DIR/app
./grailsw clean --non-interactive
./grailsw compile --non-interactive
./grailsw prod war target/ROOT.war

This cleans, compiles, and creates the WAR file.  Note that these lines replaced the old “grails prod war” command that existed in the deploy script.  The resulting deploy script looked like the following:

#!/bin/bash
set -x
export OPENSHIFT_RUNTIME_DIR=$OPENSHIFT_HOMEDIR/diy-0.1/runtime
export OPENSHIFT_LOG_DIR=$OPENSHIFT_DIY_LOG_DIR
export HOME=$OPENSHIFT_RUNTIME_DIR
export JAVA_HOME=/etc/alternatives/java_sdk_1.7.0
export JAVA_OPTS="$JAVA_OPTS -Divy.default.ivy.user.dir=$OPENSHIFT_DATA_DIR -Duser.home=$OPENSHIFT_DATA_DIR"

if [ ! -d $OPENSHIFT_RUNTIME_DIR/tomcat ]
then
    # Copy Tomcat
    cp -rf $OPENSHIFT_REPO_DIR/diy/tomcat $OPENSHIFT_RUNTIME_DIR
    rm -rf $OPENSHIFT_RUNTIME_DIR/tomcat/logs
    ln -s $OPENSHIFT_LOG_DIR $OPENSHIFT_RUNTIME_DIR/tomcat/logs
fi

cp -r $OPENSHIFT_REPO_DIR/diy/.ssh $HOME/
chmod 0600 $HOME/.ssh/id_rsa

if [ -d $OPENSHIFT_RUNTIME_DIR/app ]
then
    cd $OPENSHIFT_RUNTIME_DIR/app
    GIT_SSH="$OPENSHIFT_RUNTIME_DIR/.ssh/ssh" git pull
else
    # Checkout the app
    cd $OPENSHIFT_RUNTIME_DIR
    GIT_SSH="$OPENSHIFT_RUNTIME_DIR/.ssh/ssh" git clone REPO_LOCATION app
fi

cd $OPENSHIFT_RUNTIME_DIR/tomcat
sed -ig 's/OPENSHIFT_DATA_DIR/'$OPENSHIFT_APP_DNS'/' conf/server.xml

cd $OPENSHIFT_RUNTIME_DIR/app
./grailsw clean --non-interactive
./grailsw compile --non-interactive
./grailsw prod war target/ROOT.war

# Resetting some variables since these seemed to get reset by the grails commands
export OPENSHIFT_RUNTIME_DIR=$OPENSHIFT_HOMEDIR/diy-0.1/runtime
cp target/ROOT.war $OPENSHIFT_RUNTIME_DIR/tomcat/webapps/ROOT.war
rm -rf $OPENSHIFT_RUNTIME_DIR/tomcat/webapps/ROOT

And there you have it!  Only one more step was needed to get my app actually running.

Datasource Configuration

I added the MySQL cartridge to the Openshift application, then changed my DataSource production configuration in the grails application to the following:

def credentials = [
    hostname: System.getenv("OPENSHIFT_MYSQL_DB_HOST"),
    port:     System.getenv("OPENSHIFT_MYSQL_DB_PORT"),
    username: System.getenv("OPENSHIFT_MYSQL_DB_USERNAME"),
    password: System.getenv("OPENSHIFT_MYSQL_DB_PASSWORD"),
    name:     "APPLICATION_NAME"
]

dataSource {
    dbCreate = "create-drop"
    driverClassName = "com.mysql.jdbc.Driver"
    url = "jdbc:mysql://${credentials.hostname}:${credentials.port}/${credentials.name}?useUnicode=yes&characterEncoding=UTF-8"
    username = credentials.username
    password = credentials.password
    pooled = true
}
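One caution: dbCreate = "create-drop" rebuilds (and wipes) the schema on every restart, so "update" is safer once real data exists.  For debugging connectivity, the same JDBC URL the Groovy config builds can be assembled in shell and eyeballed from an SSH session on the gear. The fallback values below are placeholders so the sketch runs locally; APPLICATION_NAME stands in for the real database name, as above.

```shell
# Mirror the DataSource URL construction using the MySQL cartridge's
# environment variables (placeholder defaults for local runs).
: "${OPENSHIFT_MYSQL_DB_HOST:=127.0.0.1}"
: "${OPENSHIFT_MYSQL_DB_PORT:=3306}"
DB_NAME="APPLICATION_NAME"
JDBC_URL="jdbc:mysql://${OPENSHIFT_MYSQL_DB_HOST}:${OPENSHIFT_MYSQL_DB_PORT}/${DB_NAME}?useUnicode=yes&characterEncoding=UTF-8"
echo "$JDBC_URL"
```

If the app can’t reach the database, comparing this URL against `mysql -h $OPENSHIFT_MYSQL_DB_HOST -P $OPENSHIFT_MYSQL_DB_PORT` output is a quick sanity check.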

UPDATE: Fixed redeployment of the Tomcat application and attempted to fix updating of the repository. Currently the git pull fails with an error message saying the current directory is not a git repository. A workaround is to remove the app directory each time and re-clone it. I will post a fix if I find one.
