Thursday, October 10, 2019

Install just the MongoDB Shell on macOS

If you're going through the MongoDB University tutorials, they suggest installing all of MongoDB on your Mac. This is probably not what you want if you are running a Docker-based local environment - you probably want to run the server in a container, and just connect to that container, or to your Atlas cloud-based servers, from the command line.

Fortunately, there's the MongoDB brew tap!

To use this, just:

  1. Make sure you have Homebrew installed.
  2. Install the tap:  brew tap mongodb/brew
  3. Install the MongoDB command line: brew install mongodb-community-shell
Credits to this answer on dba.stackexchange!
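
Once the shell is installed, connecting looks something like this (the host, cluster name, and username below are just placeholders - substitute your own):

# Connect to a MongoDB server running in a local Docker container
# (assuming the container publishes port 27017, e.g. docker run -d -p 27017:27017 mongo):
mongo mongodb://localhost:27017

# Connect to an Atlas cluster, using the connection string from the Atlas UI:
mongo "mongodb+srv://cluster0.example.mongodb.net/test" --username myUser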



Saturday, November 10, 2018

Looking for OpenJDK for Windows?

If you're looking for a Windows build of OpenJDK, you will be happy to know that several organizations provide pre-built versions of OpenJDK for Windows:
  • Azul - https://www.azul.com/downloads/zulu/zulu-windows/
  • Red Hat - https://developers.redhat.com/products/openjdk/overview/
As always, you should read the terms and conditions to see if these binary distributions are right for your use case.

Or, you could build it yourself.

Saturday, November 3, 2018

Upgrading Java - TL;DR : Stick with OpenJDK LTS versions

With the new, faster release cycle of Java, there is a lot of confusion about when to upgrade. Upgrading Java is non-trivial for a large system, so it needs to be thought through carefully - you have to think about the IDE, all your tools, and all your dependencies.

This awesome blog post covers a lot of the details.

Here's another awesome post about upgrading to Java 11.

However, if you are using the Spring Framework, this decision is made simpler: Spring only officially supports LTS releases. I think this makes a lot of sense. For example: Spring Boot 2.1 Released - Java 11 Support

In my day job, the Spring Boot apps lean heavily on Spring's dependencies, so that we can leverage all the testing done by the Spring community.   Yes, that means working with something that's not the absolute bleeding edge, but using bleeding edge stuff is not the goal for us.  The goal is to produce applications that delight our users and add value to the business.

Sunday, October 28, 2018

Slimming down a Spring Boot app for testing

One of my favorite things about Spring Boot is the ability to launch an application in embedded mode and do some pseudo-integration-testing (that is, it's integration testing because I'm able to call the embedded app over the loopback network, as if the test were running on a different machine).   Of course you can launch your 'real' application that lives in your 'main' source folder, and you can enable / disable parts of the application with Spring profiles.

But what if you want to create a specialized Spring Boot app, just for testing?   Well, you can!

For example, in src/test/org/example/test/app/TestServer.kt we can make an app class (btw, this is Kotlin, just so you know):

package org.example.test.app

...imports blah...

@SpringBootApplication()
@Import(SomeConfig::class)
class TestGraphQLClientApp {
    /**
     * Defines the main resolvers: Query and Mutation.
     */
    @Bean
    fun resolvers(query: GraphQLQueryResolver) = listOf(query)

}

  • This is a GraphQL server, and I'm going to test a simple GraphQL client with it.   There are more components behind the scenes.
  • I'm putting it in the app sub-package to avoid loading any component in the main app or anywhere else unintentionally.    Remember that a Spring Boot app class implies a 'component scan' of the package it lives in, and all sub-packages!
  • SomeConfig is imported, and this brings in whatever components we need from the main code or elsewhere.
  • Specialized test components can be defined in packages or sub-packages.
We can make a test like this:

package org.example.test

... imports blah ...

@RunWith(SpringJUnit4ClassRunner::class)
@SpringBootTest(classes = [TestGraphQLClientApp::class, HttpClientConfiguration::class],
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
class GraphQLClientTest {
    companion object : KLogging()

    @LocalServerPort
    private val port: Int = 0

    @Autowired
    private lateinit var factory: RestTemplateFactory

    // These have to be 'by lazy' because Spring will inject the fields they rely on after init.

    private val template by lazy { factory.createRestTemplate() }

    private val baseUrl by lazy { "http://localhost:$port/graphql" }

    private val client by lazy { GraphQLClient(baseUrl, template) }

    @Test
    fun basicClientTest() {
        client.query("query { foo }").also { value ->
            logger.info { prettyPrint(value) }
            assertEquals("foo", assertHasField(value, "data", "foo").asText())
        }
        client.query("query { getThing(id: \"12345-ABC\") { one two } }").also {
            logger.info { prettyPrint(it) }
        }
    }
}
  • Note that the default properties will be loaded from application.properties or application.yml.  If we want to override this, we should probably define a profile and activate it from the test, as sketched below.
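For example, a minimal sketch (the profile name 'test' is just for illustration - its properties would live in src/test/resources/application-test.yml):

@RunWith(SpringJUnit4ClassRunner::class)
@SpringBootTest(classes = [TestGraphQLClientApp::class, HttpClientConfiguration::class],
        webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT)
@ActiveProfiles("test")   // org.springframework.test.context.ActiveProfiles
class ProfiledGraphQLClientTest {
    // ... same body as the test above ...
}
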
So what's the problem?   The problem is that, in this particular context, I have JPA and a few other Spring Boot 'starter' dependencies.   So, when the test class starts the Spring Boot app, it launches:
  1. A JDBC Data Source
  2. Liquibase (my preferred data migration tool)
  3. JPA and Hibernate
Those are all great tools, and it's super convenient to have all these auto-configuration 'starters', but they are not needed in this particular test. How do we turn them off and 'slim down' the application? There are two approaches:
  1. Create a profile, and disable some things in that profile - this works for the auto-configuration modules that support it, but not all of them do.
  2. Use 'exclude' in @SpringBootApplication to disable the auto-configuration modules we don't need.
Using exclude is easy, and since it prevents Spring from loading those auto-configurations in the first place, it can reduce startup time. So for a simple web app, we can disable all the database modules:

@SpringBootApplication(exclude = [
    LiquibaseAutoConfiguration::class,
    DataSourceAutoConfiguration::class,
    DataSourceTransactionManagerAutoConfiguration::class,
    HibernateJpaAutoConfiguration::class])

That's all we need to do!   Now the test app starts up in about 9 seconds.




Saturday, June 24, 2017

Kotlin: Getting Started - Part 1

I've been using Kotlin for a month or so now, and here are some of the smaller things that I found very useful.

Some Basic Patterns

Here are some basic patterns that you probably (hopefully) use in Java, and how to do them with Kotlin.

Builder/Constructor

In Java you may have classes with many overloaded constructors, or with constructors that have lots of nullable parameters.   I think we can all agree that this is just all around tedious for both the producer of the class and the consumer / caller.   You might decide to make a builder, which allows you to have a single constructor, and have immutable fields in your target class.   While this can make things a bit easier for the consumer/caller,  it's still a lot of boilerplate code.

Kotlin provides two features that can help with this in some cases:
  1. Optional parameters with default values
  2. Named parameters
Of course, you can still make a builder if you want.  Also, you can define secondary constructors.
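
Here's a quick sketch of what this looks like (the class and parameter names are made up for illustration):

// One primary constructor, immutable properties, and no builder boilerplate.
class ServerConfig(
        val host: String,
        val port: Int = 8080,                  // optional parameter with a default value
        val useTls: Boolean = false,
        val connectTimeoutMillis: Long = 5_000
)

// Callers pass only what they care about, using named parameters:
val localConfig = ServerConfig(host = "localhost", useTls = true)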

Caveat for Java integration: These features are not available for Java code calling Kotlin, so don't expect this to magically improve Java.

Memoize

It's really easy to create an 'initialize once' property in Kotlin: just use by lazy { ... }


You can chain these together, because 'by lazy' properties act just like normal properties.
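
Here's a minimal sketch (the class and property names are made up) showing a couple of 'by lazy' properties, including one chained off another:

import java.io.FileInputStream
import java.util.Properties

class ReportService(private val configPath: String) {

    // Computed once, the first time it's accessed.
    private val config: Properties by lazy {
        Properties().apply { FileInputStream(configPath).use { load(it) } }
    }

    // A lazy property can depend on another lazy property.
    private val reportTitle: String by lazy { config.getProperty("report.title", "Untitled") }
}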



Integrating with Existing Java Code

If you're considering Kotlin, but you have a bunch of existing Java code, you might be concerned about introducing Kotlin into your code base. Fortunately, Kotlin support is really easy to add to your build, and it's very easy to interoperate between Kotlin and Java.

Adding Kotlin Support - Gradle

To add Kotlin compilation support to an existing Gradle build:


  1. Put the Kotlin plugins in the buildscript class path:

    buildscript {
        ext {
            kotlinVersion = '1.1.2-4'
        }
        repositories {
            jcenter()
            maven { url "https://plugins.gradle.org/m2/" }
        }
    
        dependencies {    // Gradle Plugin classpath.
            classpath("org.jetbrains.kotlin:kotlin-gradle-plugin:${kotlinVersion}")
            classpath("org.jetbrains.kotlin:kotlin-allopen:${kotlinVersion}")
            classpath "org.jetbrains.kotlin:kotlin-gradle-plugin:${kotlinVersion}"
        }
    }
    

    NOTE: I'm avoiding the newer Gradle plugin configuration syntax because it does not support string interpolation. You have to repeat the kotlinVersion information over and over.
  2. Enable the Kotlin plugins:
    
    
        apply plugin: 'kotlin'
        apply plugin: 'kotlin-spring'
    
  3. Set the target JVM (optional, but I prefer to do this):

        compileKotlin {
            kotlinOptions.jvmTarget = "1.8"
        }
    
        compileTestKotlin {
            kotlinOptions.jvmTarget = "1.8"
        }
    

  4. Add the Kotlin runtime library dependencies:

        dependencies { 
            // Kotlin/JVM libraries
            compile("org.jetbrains.kotlin:kotlin-stdlib:${kotlinVersion}")
            compile("org.jetbrains.kotlin:kotlin-stdlib-jre8:${kotlinVersion}")
            compile("org.jetbrains.kotlin:kotlin-reflect:${kotlinVersion}")
            // Kotlin SLF4J utility
            compile 'io.github.microutils:kotlin-logging:1.4.4'
        }
    

Calling Kotlin Functions from Java


Java will see Kotlin top-level functions as static methods on a class named like this: <package>.<KotlinFileName>Kt.   Basically, just the class name you might expect, plus 'Kt' on the end.
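
For example (the file and function names below are made up), given a top-level function in StringUtils.kt:

// StringUtils.kt
package org.example.util

fun shout(s: String) = s + "!"

/*
 * From Java, this is callable as a static method on the generated class
 * org.example.util.StringUtilsKt:
 *
 *     String result = StringUtilsKt.shout("hello");   // "hello!"
 */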



Thursday, February 16, 2017

Tips for Building Docker Images

A few tips for building Docker images, from my experience so far.

Stick with the conventions

As with most tools, it's best to start with the conventions and stick with them unless you have a very compelling reason to customize ("you are not special").   Some of the important conventions with Docker images:

  • Put your Dockerfile in the root directory of your project (git repo).
  • Base your image on another image! This allows you to inherit all the environment variables and such from the parent. Also, if it's on Docker Hub, you can refer to the documentation.
  • Add ENV, ENTRYPOINT and EXPOSE instructions in your Dockerfile.  This will tell image users how to configure your image.
  • Add comments to indicate what files / directories can be overridden with 'volumes' for configuration.
  • Use ARG to allow you to pass in a variable during build time.   This is really good for version numbers, etc.
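
Here's a rough sketch of what those conventions can look like together (the base image is real, but the app name, port, and paths are just placeholders):

FROM openjdk:8-jre-alpine

# Version can be overridden at build time:
#   docker build --build-arg APP_VERSION=1.2.3 ...
ARG APP_VERSION=1.0.0

# Environment variable that image users can override at run time.
ENV LOG_LEVEL=INFO

# The app listens on this port.
EXPOSE 8080

# Override /app/config with a volume to supply your own configuration.
COPY config /app/config
COPY build/libs/app-${APP_VERSION}.jar /app/app.jar

ENTRYPOINT ["java", "-jar", "/app/app.jar"]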

Create The Image

To create the image, just do docker build from the root directory of the project:
docker build -t test-image --force-rm .

Where:
  • -t test-image : gives the image a name (tag) in the local docker environment.
  • --force-rm : removes intermediate containers

Parameterized Image Building with ARG

If you have an image where you need to download a version of some file, and you don't want to update the Dockerfile for every version, you can use ARG to define a variable that you can pass in to docker build, like this:

Dockerfile

FROM openjdk:8-jre-alpine

EXPOSE 9324

ARG ELASTICMQ_VERSION=0.13.2

CMD ["java", "-jar", "-Dconfig.file=/elasticmq/custom.conf", "/elasticmq/server.jar"]
COPY custom.conf /elasticmq/custom.conf

ADD "https://s3-eu-west-1.amazonaws.com/softwaremill-public/elasticmq-server-${ELASTICMQ_VERSION}.jar" /elasticmq/server.jar

  • The ARG defines ELASTICMQ_VERSION as an expected argument at build time.
You can then build this image, overriding the ELASTICMQ_VERSION, like this:
docker build -t my-elasticmq:${VER} --force-rm --build-arg ELASTICMQ_VERSION=${VER} .
Where:
  • -t my-elasticmq:${VER} : gives the image a name (tag) in the local docker environment.
  • --force-rm : removes intermediate containers
  • --build-arg ELASTICMQ_VERSION=${VER} : overrides the default ELASTICMQ_VERSION defined by ARG in the Dockerfile.
  • . : the build context (the root directory of the project)



Explore The Image

So, if you want to shell around and look at what is in the image, you can do that easily with:

docker run -it --rm --entrypoint /bin/bash test-image
Where:
  • -it : runs an interactive terminal session
  • --rm : removes the container on exit (this is really useful! Saves on having to clean up containers all the time.)
  • --entrypoint /bin/bash : the shell you want to use. We want to override the entry point so the container won't fully start whatever it usually does.
  • test-image : The image we want to start, if you gave it a name.

Tuesday, February 7, 2017

Install Groovy in an Alpine-based Docker Image

If you're making a custom image based on an Alpine Linux image, you may have a little trouble installing things that require bash, like Groovy. I tried using SDKMAN, but unfortunately I encountered a lot of problems with the compatibility of unzip and other tools. In my case, I'm creating an image based on Tomcat, and I want Groovy for doing some configuration work.

First, we install the Alpine packages we need:
  1. bash
  2. curl
  3. zip
  4. libstdc++ (Gradle needed this, but I don't think Groovy does :shrug:)

RUN apk add --update bash libstdc++ curl zip && \
    rm -rf /var/cache/apk/*

Now we need a workaround for the fact that Groovy's shell scripts start with #!/bin/sh:

# Workaround  https://issues.apache.org/jira/browse/GROOVY-7906 and other 'busybox' related issues.
RUN rm /bin/sh && ln -s /bin/bash /bin/sh

Now we can install Groovy. This could probably be done a little more optimally, but it works:
# Install groovy
# Use curl -L to follow redirects
# Also, use sed to make a workaround for https://issues.apache.org/jira/browse/GROOVY-7906
RUN curl -L https://bintray.com/artifact/download/groovy/maven/apache-groovy-binary-2.4.8.zip -o /tmp/groovy.zip && \
    cd /usr/local && \
    unzip /tmp/groovy.zip && \
    rm /tmp/groovy.zip && \
    ln -s /usr/local/groovy-2.4.8 groovy && \
    /usr/local/groovy/bin/groovy -v && \
    cd /usr/local/bin && \
    ln -s /usr/local/groovy/bin/groovy groovy

As always, if you have any suggestions about how to make it better, let me know.