
Tuesday, September 1, 2015

Spocklight: Auto Cleanup Resources

Spock has a lot of nice extensions we can use in our specifications. The AutoCleanup extension makes sure the close() method of an object is invoked each time a feature method is finished. We could also invoke the close() method from the cleanup method in our specification, but the @AutoCleanup annotation is easier and immediately shows our intention. If the object we apply the annotation to doesn't have a close() method, we can specify the name of the method to invoke as the value of the annotation. Finally, we can set the attribute quiet to true if we don't want to see any exceptions that are raised when the close() method (or the custom method we specified) is invoked.
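
Under the hood the extension simply invokes the configured method after the feature method is done. The following Java sketch (only an illustration of the annotation's value and quiet attributes, not Spock's actual implementation) shows the idea:

```java
import java.lang.reflect.Method;

public class AutoCleanupSketch {

    // Hypothetical resource with a close() method.
    public static class Resource {
        public boolean closed = false;
        public void close() { closed = true; }
    }

    // Rough equivalent of what the extension does after a feature
    // method: invoke the configured cleanup method reflectively and,
    // when quiet is true, swallow any exception it raises.
    public static void autoCleanup(Object target, String methodName, boolean quiet) {
        try {
            Method cleanupMethod = target.getClass().getMethod(methodName);
            cleanupMethod.invoke(target);
        } catch (Exception e) {
            if (!quiet) {
                throw new RuntimeException(e);
            }
        }
    }

    public static void main(String[] args) {
        Resource resource = new Resource();
        autoCleanup(resource, "close", false);
        System.out.println(resource.closed); // prints true

        // A missing cleanup method only fails loudly when quiet is false.
        autoCleanup(resource, "doesNotExist", true);
    }
}
```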

In the following example code we have a specification that tests the WatchService implementation. The implementation also implements the Closeable interface, which means we can use the close() method to clean up the object properly. We also have a custom class WorkDir with a delete() method that needs to be invoked.

package com.mrhaki.spock

import spock.lang.AutoCleanup
import spock.lang.Specification

import java.nio.file.FileSystems
import java.nio.file.Files
import java.nio.file.Path
import java.nio.file.Paths
import java.nio.file.WatchKey
import java.nio.file.WatchService
import java.util.concurrent.TimeUnit

import static java.nio.file.StandardWatchEventKinds.ENTRY_CREATE

class WatchSpec extends Specification {

    // Use close() method of WatchService
    // when feature method is done.
    @AutoCleanup
    private WatchService watchService

    // Use delete() method of WorkDir class
    // when feature method is done.
    // If the quiet attribute is true, then
    // exceptions from the delete() method are
    // not shown, otherwise exceptions are reported.
    @AutoCleanup(value = 'delete', quiet = true)
    private WorkDir testPath = new WorkDir('test-dir')

    def setup() {
        // Get new watch service.
        watchService = FileSystems.default.newWatchService()

        // Register for events when a new file is created
        // in the testPath directory.
        testPath.path.register(watchService, ENTRY_CREATE)
    }

    def "get notification when file is created"() {
        given:
        final Path testFile = testPath.path.resolve('test-file')
        testFile << 'sample'

        and:
        final WatchKey watchKey = watchService.poll(10, TimeUnit.SECONDS)

        when:
        final events = watchKey.pollEvents()

        then:
        events.size() == 1
        events[0].kind() == ENTRY_CREATE

        cleanup:
        Files.delete(testFile)
    }

}

class WorkDir {

    private final Path path

    WorkDir(final String dir) {
        path = Paths.get(dir)
        Files.createDirectories(path)
    }

    Path getPath() {
        path
    }

    void delete() {
        Files.deleteIfExists(path)
    }

}

Written with Spock 1.0-groovy-2.4.

Friday, August 28, 2015

Spocklight: Optimize Run Order Test Methods

Spock is able to change the execution order of test methods in a specification. We can tell Spock to re-run failing methods before successful methods. And if we have multiple failing or successful tests, the fastest methods run first, followed by the slower ones. This way, when we re-run the specification, we immediately see the failing methods and can stop the execution to fix the errors. We must set the property optimizeRunOrder in the runner configuration of the Spock configuration file. A Spock configuration file with the name SpockConfig.groovy can be placed on the classpath of our test execution or in our USER_HOME/.spock directory. We can also use the Java system property spock.configuration and assign the filename of our Spock configuration file.

In the following example we have a specification with different methods that can be successful or fail and have different durations when executed:

package com.mrhaki.spock

import spock.lang.Specification
import spock.lang.Subject

class SampleSpec extends Specification {

    @Subject
    private Sample sample = new Sample()

    def "spec1 - slowly should return name property value"() {
        given:
        sample.name = testValue

        expect:
        sample.slowly() == testValue

        where:
        testValue = 'Spock rules'
    }

    def "spec2 - check name property"() {
        given:
        sample.name = testValue

        expect:
        sample.name == testValue

        where:
        testValue = 'Spock is gr8'
    }

    def "spec3 - purposely fail test at random"() {
        given:
        sample.name = testValues[randomIndex]

        expect:
        sample.name == testValues[0]

        where:
        testValues = ['Spock rules', 'Spock is gr8']
        randomIndex = new Random().nextInt(testValues.size())
    }

    def "spec4 - purposely fail test slowly"() {
        given:
        sample.name = 'Spock is gr8'

        expect:
        sample.slowly() == 'Spock rules'
    }

    def "spec5 - purposely fail test"() {
        given:
        sample.name = 'Spock rules'

        expect:
        sample.name == 'Spock is gr8'
    }
}

class Sample {
    String name

    String slowly() {
        Thread.sleep(2000)
        name
    }
}

Let's run our test without an optimised run order. We see the methods are executed in the order they are defined in the specification:

Next we create a Spock configuration file with the following contents:

runner {
    println "Optimize run order"
    optimizeRunOrder true
}

If we re-run our specification and have this file in the classpath we already see the order of the methods has changed. The failing tests are at the top and the successful tests are at the bottom. The slowest test method is last:

Another re-run has optimised the order by running the slowest failing test after the other failing tests.
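
Conceptually the optimised order amounts to sorting the methods on their previous result and duration: failing before successful, fast before slow. The following Java sketch is only an illustration of that ordering, not Spock's actual implementation:

```java
import java.util.ArrayList;
import java.util.Comparator;
import java.util.List;

public class RunOrderSketch {

    // Hypothetical record of a previous run: method name,
    // whether it failed, and how long it took.
    public record RunResult(String name, boolean failed, long millis) {}

    // Conceptual ordering: failing methods before successful ones,
    // and within each group the fastest methods first.
    public static List<String> optimizedOrder(List<RunResult> history) {
        List<RunResult> sorted = new ArrayList<>(history);
        sorted.sort(Comparator
                .comparing((RunResult r) -> !r.failed())
                .thenComparingLong(RunResult::millis));
        return sorted.stream().map(RunResult::name).toList();
    }

    public static void main(String[] args) {
        List<RunResult> history = List.of(
                new RunResult("slow pass", false, 2000),
                new RunResult("slow fail", true, 2000),
                new RunResult("fast pass", false, 10),
                new RunResult("fast fail", true, 10));

        System.out.println(optimizedOrder(history));
        // [fast fail, slow fail, fast pass, slow pass]
    }
}
```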

Spock keeps track of the failing and successful methods and their execution time in a file with the specification name in the USER_HOME/.spock/RunHistory directory. To reset the information we must delete the file from this directory.

Written with Spock 1.0-groovy-2.4.

Thursday, August 27, 2015

Gradle Goodness: Quickly Open Test Report in IntelliJ IDEA

When we execute a Gradle test task in IntelliJ IDEA the IDE test runner is used. This means we can get a nice overview of all tests that have run. If a test is successful we get a green light, otherwise it is red in case of an error or orange in case of a failure. Gradle also generates a nice HTML report when we run the test task. It is placed in the directory build/reports/tests/. IntelliJ IDEA has a button which opens this report when we click on it. The following screenshot shows the button with the tooltip that opens a Gradle test report:

Written with IntelliJ IDEA 14 and Gradle 2.6.

Spocklight: Include or Exclude Specifications Based On Class or Interface

In a previous post we saw how we can use the Spock configuration file to include or exclude specifications based on annotations. Instead of using annotations we can also use classes to include or exclude specifications. For example, we could have a base specification class DatabaseSpecification. Other specifications dealing with databases extend this class. To include all these specifications we use DatabaseSpecification as the value of the include property in the test runner configuration.

Because Java (and Groovy) doesn't support real multiple inheritance, this might be a problem if we already have specifications that extend a base class which cannot be used as a filter in the include and exclude runner configuration. Luckily we can also use an interface as the value for the inclusion or exclusion. So we can simply create a marker interface and implement this interface in the specifications we want to include in or exclude from the test execution.

We rewrite the sample from our previous post to show how we can use a marker interface. First we take a look at the interface definition:

package com.mrhaki.spock

interface RemoteSpec {
}

We used the @Remote annotation on a specification method, but an interface is implemented on a class. Therefore we must refactor our specification into three classes. First we create a base specification class:

package com.mrhaki.spock

import spock.lang.Specification
import spock.lang.Subject

abstract class WordRepositorySpec<S extends Access> extends Specification {

    @Subject
    S access

    def "test remote or local access"() {
        expect:
        access.findWords('S') == ['Spock']
    }

}

Next we write two new specification classes that extend this abstract class; each uses a different implementation for the class under test.

package com.mrhaki.spock

class LocalWordRepositorySpec 
        extends WordRepositorySpec<LocalAccess> {

    def setup() {
        access = new LocalAccess()
    }

}
package com.mrhaki.spock

class RemoteWordRepositorySpec 
        extends WordRepositorySpec<RemoteAccess>
        // Implement RemoteSpec marker interface
        implements RemoteSpec {

    def setup() {
        access = new RemoteAccess()
    }

}

Instead of using an annotation we use the marker interface RemoteSpec to include the specifications in our test execution with the following configuration:

import com.mrhaki.spock.RemoteSpec

runner {
    println "Using RemoteSpockConfig"

    // Include only test classes that
    // implement the RemoteSpec interface.
    include RemoteSpec

    // Alternative syntax
    // to only look for classes or interfaces.
    // include {
    //     baseClass RemoteSpec
    // }

    // We can also add a condition in
    // the configuration file.
    // In this case we check for a Java
    // system property and if set the
    // specs with interface RemoteSpec
    // are not run.
    if (System.properties['spock.ignore.Remote']) {
        exclude RemoteSpec
    }
}

If we run the specifications with the same Gradle build file as the previous post we see that with the default test task all specifications are executed:

And when we execute the remoteTest task only the RemoteWordRepositorySpec is executed:

Written with Gradle 2.6 and Spock 1.0-groovy-2.4.

Code is available on Github

Spocklight: Including or Excluding Specifications Based On Annotations

One of the lesser known and documented features of Spock is the external Spock configuration file. In this file we can, for example, specify which specifications to include in or exclude from a test run. We can specify a class name (for example a base specification class, like DatabaseSpec) or an annotation. In this post we see how to use annotations to have some specifications run and others not.

The external Spock configuration file is actually a Groovy script file. We must specify a runner method with a closure argument in which we basically configure the test runner. To include specification classes or methods with a certain annotation applied to them, we configure the include property of the test runner. To exclude a class or method we use the exclude property. Because the configuration file is a Groovy script we can use everything Groovy has to offer, like conditional statements, println statements and more.

Spock looks for a file named SpockConfig.groovy in the classpath of the test execution and in the USER_HOME/.spock directory. We can also use the Java system property spock.configuration with the file name of the configuration file.
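
The lookup can be pictured with a small Java sketch. This is only an illustration of the locations described above, not Spock's actual resolution code, and it assumes the system property takes precedence over the file locations:

```java
import java.nio.file.Files;
import java.nio.file.Path;
import java.nio.file.Paths;

public class ConfigLookupSketch {

    // Rough lookup order: the spock.configuration system property,
    // then SpockConfig.groovy on the classpath, then
    // SpockConfig.groovy in USER_HOME/.spock.
    public static String resolveConfig() {
        String fromProperty = System.getProperty("spock.configuration");
        if (fromProperty != null) {
            return fromProperty;
        }
        if (ConfigLookupSketch.class.getResource("/SpockConfig.groovy") != null) {
            return "classpath:SpockConfig.groovy";
        }
        Path userConfig = Paths.get(
                System.getProperty("user.home"), ".spock", "SpockConfig.groovy");
        if (Files.exists(userConfig)) {
            return userConfig.toString();
        }
        return null;
    }

    public static void main(String[] args) {
        System.setProperty("spock.configuration", "MySpockConfig.groovy");
        System.out.println(resolveConfig()); // prints MySpockConfig.groovy
    }
}
```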

In the following example we first define a simple annotation Remote. This annotation can be applied to a class or method:

package com.mrhaki.spock

import java.lang.annotation.ElementType
import java.lang.annotation.Retention
import java.lang.annotation.RetentionPolicy
import java.lang.annotation.Target

@Target([ElementType.TYPE, ElementType.METHOD])
@Retention(RetentionPolicy.RUNTIME)
@interface Remote {
}

We write a simple Spock specification where we apply the Remote annotation to one of the methods:

package com.mrhaki.spock

import spock.lang.Specification

class WordRepositorySpec extends Specification {

    @Remote  // Apply our Remote annotation.
    def "test remote access"() {
        given:
        final RemoteAccess access = new RemoteAccess()

        expect:
        access.findWords('S') == ['Spock']
    }

    def "test local access"() {
        given:
        final LocalAccess access = new LocalAccess()

        expect:
        access.findWords('S') == ['Spock']
    }

}

Next we create a Spock configuration file:

import com.mrhaki.spock.Remote

runner {
    // This is Groovy script and we 
    // add arbitrary code.
    println "Using RemoteSpockConfig"

    // Include only test classes or test
    // methods with the @Remote annotation
    include Remote

    // Alternative syntax
    // to only look for annotations.
    // include {
    //     annotation Remote
    // }

    
    // We can also add a condition in
    // the configuration file.
    // In this case we check for a Java
    // system property and if set the
    // specs with @Remote are not run.
    if (System.properties['spock.ignore.Remote']) {
        exclude Remote
    }
}

When we run the WordRepositorySpec and our configuration file is on the classpath, only the specification methods with the @Remote annotation are executed. Let's apply this in a simple Gradle build file. In this case we save the configuration file as src/test/resources/RemoteSpockConfig.groovy, create a new test task remoteTest and set the Java system property spock.configuration:

apply plugin: 'groovy'

repositories {
    jcenter()
}

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.4.4'
    testCompile 'org.spockframework:spock-core:1.0-groovy-2.4'
}

// New test task with specific 
// Spock configuration file.
task remoteTest(type: Test) {
    // This task belongs to Verification task group.
    group = 'Verification'

    // Set Spock configuration file when running
    // this test task.
    systemProperty 'spock.configuration', 'RemoteSpockConfig.groovy'
}

Now when we execute the Gradle test task all specifications are executed:

And when we run remoteTest only the specification method with the @Remote annotation is executed:

Written with Gradle 2.6 and Spock 1.0-groovy-2.4.

The code is available on Github

Wednesday, August 26, 2015

Show Logback Configuration Status with Groovy Configuration

Logback is an SLF4J API implementation for logging messages in Java and Groovy. We can configure Logback with a Groovy configuration file. The file is a Groovy script and allows for a nice and clean way (no XML) to configure Logback. If we want to see how Logback is configured we must add a StatusListener implementation in our configuration. The StatusListener implementation prints out the configuration when our application starts. Logback provides status listeners for outputting the information to the standard output or standard error stream (OnConsoleStatusListener and OnErrorConsoleStatusListener). If we want to disable any status messages we use the NopStatusListener class.

In the following example configuration file we define the status listener with the statusListener method. We use the OnConsoleStatusListener in our example:

// Add status listener, which prints out
// the Logback configuration on application
// startup.
statusListener(OnConsoleStatusListener)

appender('SystemOut', ConsoleAppender) {
    // Enable coloured output.
    withJansi = true

    encoder(PatternLayoutEncoder) {
        pattern = "%blue(%-5level) %green(%logger{35}) - %msg %n"
    }
}

root(DEBUG, ['SystemOut'])

If we run our Java or Groovy application with this configuration we get the following information in our console:

    09:31:09,143 |-INFO in ch.qos.logback.classic.gaffer.ConfigurationDelegate@3b0f5da3 - Added status listener of type [ch.qos.logback.core.status.OnConsoleStatusListener]
    09:31:09,167 |-INFO in ch.qos.logback.classic.gaffer.ConfigurationDelegate@3b0f5da3 - About to instantiate appender of type [ch.qos.logback.core.ConsoleAppender]
    09:31:09,168 |-INFO in ch.qos.logback.classic.gaffer.ConfigurationDelegate@3b0f5da3 - Naming appender as [SystemOut]
    09:31:09,239 |-INFO in ch.qos.logback.classic.gaffer.ConfigurationDelegate@3b0f5da3 - Setting level of logger [ROOT] to DEBUG
    09:31:09,244 |-INFO in ch.qos.logback.classic.gaffer.ConfigurationDelegate@3b0f5da3 - Attaching appender named [SystemOut] to Logger[ROOT]

Alternatively we can use the Java system property logback.statusListenerClass when we run our application. We must provide the fully qualified class name of the StatusListener implementation. This overrules any status listeners that are defined in configuration files.
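
Setting the property from code, before Logback initialises, has the same effect as passing it on the command line. A minimal Java sketch:

```java
public class StatusListenerProperty {

    public static void main(String[] args) {
        // Same effect as passing
        //   -Dlogback.statusListenerClass=ch.qos.logback.core.status.OnConsoleStatusListener
        // on the command line; it must be set before Logback initialises.
        System.setProperty("logback.statusListenerClass",
                "ch.qos.logback.core.status.OnConsoleStatusListener");

        System.out.println(System.getProperty("logback.statusListenerClass"));
    }
}
```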

Written with Logback 1.1.3.

Tuesday, August 25, 2015

Gradle Goodness Notebook Updated

Gradle Goodness Notebook has been updated. If you have bought the book you can download the latest version for free. The following blog posts have been added:

  • Define System Properties in gradle.properties File
  • Alter Start Scripts from Application Plugin
  • Handle Copying Duplicate Files
  • Use Git Commit Id in Build Script
  • Using Continuous Build Feature

Grails Goodness Notebook Updated

Grails Goodness Notebook has been updated with the latest blog posts. If you have purchased the book before you can download the latest version of the book for free.

  • Create New Application without Wrapper
  • Add Banner to Grails Application
  • Set Log Level for Grails Artifacts
  • Add Some Color to Our Logging
  • Save Application PID in File
  • Log Startup Info
  • Adding Health Check Indicators
  • Custom Data Binding with @DataBinding Annotation

Monday, August 24, 2015

Groovy Goodness Notebook Is Updated

Groovy Goodness Notebook, which contains the Groovy blog posts in an organised form, is updated with the following blog posts. If you have purchased the book before you get the update for free.

  •  Relax... Groovy Will Parse Your Wicked JSON
  •  Nested Templates with MarkupTemplateEngine
  •  Use Custom Template Class with MarkupTemplateEngine
  •  Using Layouts with MarkupTemplateEngine
  •  Closure as a Class
  •  Take Or Drop Last Items From a Collection
  •  Getting All But the Last Element in a Collection with Init Method
  •  Getting the Indices of a Collection
  •  Pop And Push Items In a List
  •  Access XML-RPC API
  •  Use Constructor as Method Pointer
  •  Combine Elements Iterable with Index
  •  Swapping Elements in a Collection
  •  New Methods to Sort and Remove Duplicates From Collection
  •  Use Closures as Java Lambda Expressions
  •  Share Data in Concurrent Environment with Dataflow Variables

Redirect Logging Output to Standard Error with Logback

When we use Logback as the SLF4J API implementation in our code we can have our logging output sent to the console. By default the standard output is used to display the logging output. We can alter the configuration and have the console logging output sent to standard error instead. This can be useful when we use a framework that uses standard output for communication and we still want to see logging from our application on the console. We set the property target of the ConsoleAppender to the value System.err instead of the default System.out.

The following sample Logback configuration in Groovy sends the logging output for the console to standard error:

appender("SystemErr", ConsoleAppender) {
    // Enable coloured output.
    withJansi = true

    encoder(PatternLayoutEncoder) {
        pattern = "%blue(%-5level) %green(%logger{35}) - %msg %n"
    }

    // Redirect output to the System.err.
    target = 'System.err'

}

root(DEBUG, ["SystemErr"])

If we want to use the XML format we get the following configuration:

<?xml version="1.0" encoding="UTF-8"?>
<configuration>

    <appender name="SystemErr"
              class="ch.qos.logback.core.ConsoleAppender">

        <!-- Redirect output to System.err. -->
        <target>System.err</target>

        <!-- Enable coloured output. -->
        <withJansi>true</withJansi>

        <encoder>
            <pattern>%blue(%-5level) %green(%logger{35}) - %msg %n</pattern>
        </encoder>

    </appender>

    <root level="DEBUG">
        <appender-ref ref="SystemErr"/>
    </root>

</configuration>

Written with Logback 1.1.3.