Tuesday, October 21, 2014

Gradle Goodness: Changing Name of Default Build File

Gradle uses build.gradle as the default name for a build file. If we write our build code in a file named build.gradle we don't have to specify the build file name when we run tasks. But we can also create build files with a different name, for example sample.gradle. To run the tasks from such a build file we can use the command line option -b or --build-file followed by the file name. Alternatively we can change the project settings and set a new default build file name for our project. With the changed project settings we no longer need the -b or --build-file options.

Suppose we have the following build file with the name sample.gradle:

// File: sample.gradle
task sample(description: 'Sample task') << {
    println 'Sample task'
}

defaultTasks 'sample'

To run the sample task from the command line we can use the command line options -b or --build-file:

$ gradle -b sample.gradle
:sample
Sample task

BUILD SUCCESSFUL

Total time: 3.168 secs
$ gradle --build-file sample.gradle 
:sample
Sample task

BUILD SUCCESSFUL

Total time: 2.148 secs
$

To change the default build file name for our project we create a file settings.gradle in our project. Inside the settings.gradle file we can change the property buildFileName for rootProject:

// File: settings.gradle
// Change default build file name for this project.
rootProject.buildFileName = 'sample.gradle'

Now we execute the tasks from sample.gradle without the options -b or --build-file:

$ gradle
:sample
Sample task

BUILD SUCCESSFUL

Total time: 3.312 secs
$
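
The buildFileName property is not limited to the root project. In a multi-project build we can set it per subproject as well; a minimal sketch, assuming a subproject named sample-sub that keeps its build logic in sample-sub.gradle:

// File: settings.gradle
include 'sample-sub'

// Assumed subproject name; adjust to the real project name.
project(':sample-sub').buildFileName = 'sample-sub.gradle'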

Code written with Gradle 2.1.

Wednesday, October 15, 2014

Gradle Goodness: Show Standard Out or Error Output from Tests

We use the Test task in Gradle to run tests. If we use the System.out.println or System.err.println methods in our tests we don't see the output when we execute the tests. We can customize the test task so any output sent to standard out or standard error is shown in the Gradle output.

First we show our test class written with Spock, but it could also be a JUnit or TestNG test:

// File: src/test/groovy/com/mrhaki/gradle/SampleSpec.groovy
package com.mrhaki.gradle

import spock.lang.*

class SampleSpec extends Specification {

    def "check that Gradle is Gr8"() {
        when:
        def value = 'Gradle is great!'

        then:
        // Include a println statement, so
        // we have output to show.
        println "Value = [$value]"
        value == 'Gradle is great!'
    }

}

Now we write a simple Gradle build file which can execute our test:

// File: build.gradle
apply plugin: 'groovy' // Adds test task

repositories.jcenter()

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.3.7'
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
}

Let's run the test task from the command line and look at the output:

$ gradle test
:compileJava UP-TO-DATE
:compileGroovy UP-TO-DATE
:processResources UP-TO-DATE
:classes UP-TO-DATE
:compileTestJava
:compileTestGroovy
:processTestResources UP-TO-DATE
:testClasses
:test

BUILD SUCCESSFUL

Total time: 7.022 secs
$

Well at least our test is successful, but we don't see the output of our println method invocation in the test. We customize the test task and add the testLogging method with a configuration closure. In the closure we set the property showStandardStreams to the value true. Alternatively we can set the events property, or use the events method, with the values standard_out and standard_error to achieve the same result. In the next build file we use the showStandardStreams property:

// File: build.gradle
apply plugin: 'groovy' // Adds test task

repositories.jcenter()

dependencies {
    compile 'org.codehaus.groovy:groovy-all:2.3.7'
    testCompile 'org.spockframework:spock-core:0.7-groovy-2.0'
}

test {
    testLogging {
        // Make sure output from 
        // standard out or error is shown
        // in Gradle output.
        showStandardStreams = true

        // Or we use events method:
        // events 'standard_out', 'standard_error'

        // Or set property events:
        // events = ['standard_out', 'standard_error']

        // Instead of string values we can
        // use enum values:
        // events org.gradle.api.tasks.testing.logging.TestLogEvent.STANDARD_OUT,
        //        org.gradle.api.tasks.testing.logging.TestLogEvent.STANDARD_ERROR,
    }
}

We re-run the test task from the command line and look at the output to see the result from the println method:

$ gradle test
:compileJava UP-TO-DATE
:compileGroovy UP-TO-DATE
:processResources UP-TO-DATE
:classes UP-TO-DATE
:compileTestJava
:compileTestGroovy
:processTestResources UP-TO-DATE
:testClasses
:test

com.mrhaki.gradle.SampleSpec > check that Gradle is Gr8 STANDARD_OUT
    Value = [Gradle is great!]

BUILD SUCCESSFUL

Total time: 8.716 secs
$

Written with Gradle 2.1.

Tuesday, October 14, 2014

Groovy Goodness: Closure as a Class

When we write Groovy code there is a big chance we also write some closures. If we work with collections, for example, and use the each, collect or find methods, we use closures as arguments for these methods. We can assign closures to variables and use the variable name to reference the closure. But we can also create a subclass of the Closure class to implement a closure. Then we use an instance of the new closure class wherever a closure can be used.

To write a closure as a class we must subclass Closure and implement a method with the name doCall. The method can accept arbitrary arguments and we can define the return type ourselves. So we are not overriding a doCall method from the superclass Closure; Groovy simply looks for a method with the name doCall to execute the closure logic and internally uses methods from the Closure superclass.

In the following sample we write a very simple closure as a class to check if an object is a number. Then we use an instance of the class with the findAll method for a collection of objects:

class IsNumber extends Closure<Boolean> /* return type for closure as generic type */ {

    IsNumber() {
        super(null)
    }

    /**
     * Implementation of closure.
     */
    Boolean doCall(final Object value) {
        // Check if value is a number, if so
        // return true, otherwise false.
        value in Number
    }

}

def list = ['a', 100, 'Groovy', 1, 8, 42.0, true]

def numbers = list.findAll(new IsNumber())

assert numbers == [100, 1, 8, 42.0]
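
Because IsNumber extends Closure, an instance of the class can also be invoked directly, just like a regular closure. A small usage sketch:

def isNumber = new IsNumber()

// Invoke the closure instance directly...
assert isNumber(42)
assert !isNumber('Groovy')

// ...or explicitly via the call method.
assert isNumber.call(100)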

Code written with Groovy 2.3.7.

Monday, October 13, 2014

Spocklight: Indicate Class Under Test with Subject Annotation

If we write a specification for a specific class we can indicate that class with the @Subject annotation. The annotation is only for informational purposes, but it can help in making clear which class we are writing the specification for. The annotation can be used at class level or field level. If we use the annotation at class level we must specify the class or classes under test as an argument for the annotation. If we apply the annotation to a field, the type of the field is used as the class under test. The field can be part of the class definition, but we can also apply the @Subject annotation to a variable inside a feature method, as shown in the sketch after the example.

In the following example Spock specification we write a specification for the class Greet. The definition of the Greet class is also in the code listing. We use the @Subject annotation on the field greet to indicate this instance of the Greet class is the class we are testing here. The code also works without the @Subject annotation, but the annotation adds more clarity to the specification.

package com.mrhaki.spock

@Grab('org.spockframework:spock-core:0.7-groovy-2.0')
import spock.lang.*

// The @Subject annotation can also be applied at class level.
// We must specify the class or classes as arguments:
// @Subject([Greet])
class GreetSpec extends Specification {

    // The greet variable of type Greet is the 
    // class we are testing in this specification.
    // We indicate this with the @Subject annotation.
    @Subject 
    private Greet greet = new Greet(['Hi', 'Hello'])

    // Simple specification to test the greeting method.
    def "greeting should return a random salutation followed by name"() {
        when:
        final String greeting = greet.greeting('mrhaki')

        then:
        greeting == 'Hi, mrhaki' || greeting == 'Hello, mrhaki'
    }

}

/**
 * Class which is tested in the above specification.
 */
@groovy.transform.Immutable
class Greet {

    final List<String> salutations

    String greeting(final String name) {
        final int numberOfSalutations = salutations.size()
        final int selectedIndex = new Random().nextInt(numberOfSalutations)
        final String salutation = salutations.get(selectedIndex)

        "${salutation}, ${name}"
    }

}
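
The following sketch shows the @Subject annotation applied to a variable inside a feature method, reusing the Greet class from above. The feature method name and the single salutation are just examples:

def "greeting with subject inside the feature method"() {
    given:
    // Indicate the class under test inside this feature method.
    @Subject
    Greet greet = new Greet(['Hi'])

    expect:
    greet.greeting('mrhaki') == 'Hi, mrhaki'
}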

Code written with Spock 0.7-groovy-2.0 and Groovy 2.3.7.

Monday, September 29, 2014

Gradle Goodness: Running Groovy Scripts as Application

In a previous post we learned how to run a Java application in a Gradle project. The Java source file with a main method is part of the project and we use the JavaExec task to run the Java code. We can use the same JavaExec task to run a Groovy script file.

A Groovy script file doesn't have an explicit main method, but one is added when we compile the script file. The name of the script file is also the name of the generated class, so we use that name for the main property of the JavaExec task. Let's first create a simple Groovy script file to display the current date. We can pass an extra argument with the date format we want to use.

// File: src/main/groovy/com/mrhaki/CurrentDate.groovy
package com.mrhaki

// If an argument is passed we assume it is the
// date format we want to use.
// Default format is dd-MM-yyyy.
final String dateFormat = args ? args[0] : 'dd-MM-yyyy'

// Output formatted current date and time.
println "Current date and time: ${new Date().format(dateFormat)}"

Our Gradle build file contains the task runScript of type JavaExec. We rely on the Groovy libraries included with Gradle, because we use localGroovy() as a compile dependency. Of course we can change this to refer to another Groovy version if we want, using the group, name and version notation together with a valid repository; a sketch of this follows at the end of this post.

// File: build.gradle
apply plugin: 'groovy'

dependencies {
    compile localGroovy()
}

task runScript(type: JavaExec) {
    description 'Run Groovy script'

    // Set main property to name of Groovy script class.
    main = 'com.mrhaki.CurrentDate'

    // Set classpath for running the Groovy script.
    classpath = sourceSets.main.runtimeClasspath

    if (project.hasProperty('custom')) {
        // Pass command-line argument to script.
        args project.getProperty('custom')
    }
}

defaultTasks 'runScript'

We can run the script with or without the project property custom and we see the changes in the output:

$ gradle -q
Current date and time: 29-09-2014
$ gradle -q -Pcustom=yyyyMMdd
Current date and time: 20140929
$ gradle -q -Pcustom=yyyy
Current date and time: 2014
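
If we prefer a specific Groovy version over the distribution bundled with Gradle, we can replace the localGroovy() dependency with an external module dependency and a repository. A minimal sketch, assuming Groovy 2.3.7 from JCenter:

// File: build.gradle (alternative dependency setup)
apply plugin: 'groovy'

repositories {
    jcenter()
}

dependencies {
    // Explicit Groovy version instead of localGroovy().
    compile 'org.codehaus.groovy:groovy-all:2.3.7'
}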

Code written with Gradle 2.1.

Friday, September 19, 2014

Gradle Goodness: Adding Dependencies Only for Packaging to War

My colleague, Tom Wetjens, wrote the blog post Package-only dependencies in Maven. He showed a Maven solution for including dependencies in the WAR file that are not used in any other scope. In this blog post we will see how we solve this in Gradle.

Suppose we use the SLF4J Logging API in our project. We use the API as a compile dependency, because our code uses this API. But in our test runtime we want to use the SLF4J Simple implementation of this API. And in our WAR file we want to include the Logback implementation of the API. The Logback dependency only needs to be included in the WAR file and shouldn't exist in any other dependency configuration.

We first add the War plugin to our project. The war task uses the runtime dependency configuration to determine which files are added to the WEB-INF/lib directory in our WAR file. We add a new dependency configuration warLib that extends the runtime configuration in our project.

apply plugin: 'war'

repositories.jcenter()

configurations {
    // Create new dependency configuration
    // for dependencies to be added in 
    // WAR file.
    warLib.extendsFrom runtime
}

dependencies {
    // API dependency for Slf4j.
    compile 'org.slf4j:slf4j-api:1.7.7'

    testCompile 'junit:junit:4.11'

    // Slf4j implementation used for tests.
    testRuntime 'org.slf4j:slf4j-simple:1.7.7'

    // Slf4j implementation to be packaged
    // in WAR file.
    warLib 'ch.qos.logback:logback-classic:1.1.2'
}

war {
    // Add warLib dependency configuration
    classpath configurations.warLib

    // We remove all duplicate files
    // with this assignment.
    // getFiles() method returns a unique
    // set of File objects, removing
    // any duplicates from configurations
    // added by classpath() method.
    classpath = classpath.files
}

We can now run the build task and we get a WAR file with the following contents:

$ gradle build
:compileJava UP-TO-DATE
:processResources UP-TO-DATE
:classes UP-TO-DATE
:war
:assemble
:compileTestJava
:processTestResources UP-TO-DATE
:testClasses
:test
:check
:build

BUILD SUCCESSFUL

Total time: 6.18 secs
$ jar tvf build/libs/package-only-dep-example.war
     0 Fri Sep 19 05:59:54 CEST 2014 META-INF/
    25 Fri Sep 19 05:59:54 CEST 2014 META-INF/MANIFEST.MF
     0 Fri Sep 19 05:59:54 CEST 2014 WEB-INF/
     0 Fri Sep 19 05:59:54 CEST 2014 WEB-INF/lib/
 29257 Thu Sep 18 14:36:24 CEST 2014 WEB-INF/lib/slf4j-api-1.7.7.jar
270750 Thu Sep 18 14:36:24 CEST 2014 WEB-INF/lib/logback-classic-1.1.2.jar
427729 Thu Sep 18 14:36:26 CEST 2014 WEB-INF/lib/logback-core-1.1.2.jar
   115 Wed Sep 03 09:24:40 CEST 2014 WEB-INF/web.xml

Also when we run the dependencies task we can see how the implementations of the SLF4J API relate to the dependency configurations:

$ gradle dependencies
:dependencies

------------------------------------------------------------
Root project
------------------------------------------------------------

archives - Configuration for archive artifacts.
No dependencies

compile - Compile classpath for source set 'main'.
\--- org.slf4j:slf4j-api:1.7.7

default - Configuration for default artifacts.
\--- org.slf4j:slf4j-api:1.7.7

providedCompile - Additional compile classpath for libraries that should not be part of the WAR archive.
No dependencies

providedRuntime - Additional runtime classpath for libraries that should not be part of the WAR archive.
No dependencies

runtime - Runtime classpath for source set 'main'.
\--- org.slf4j:slf4j-api:1.7.7

testCompile - Compile classpath for source set 'test'.
+--- org.slf4j:slf4j-api:1.7.7
\--- junit:junit:4.11
     \--- org.hamcrest:hamcrest-core:1.3

testRuntime - Runtime classpath for source set 'test'.
+--- org.slf4j:slf4j-api:1.7.7
+--- junit:junit:4.11
|    \--- org.hamcrest:hamcrest-core:1.3
\--- org.slf4j:slf4j-simple:1.7.7
     \--- org.slf4j:slf4j-api:1.7.7

warLib
+--- org.slf4j:slf4j-api:1.7.7
\--- ch.qos.logback:logback-classic:1.1.2
     +--- ch.qos.logback:logback-core:1.1.2
     \--- org.slf4j:slf4j-api:1.7.6 -> 1.7.7

(*) - dependencies omitted (listed previously)

BUILD SUCCESSFUL

Total time: 6.274 secs

Code written with Gradle 2.1.

Thursday, August 28, 2014

Awesome Asciidoc: Write Extensions Using Groovy (or Java)

We can write extensions for Asciidoctor using the extension API in Groovy (or any other JVM language) if we run Asciidoctor on the Java platform with AsciidoctorJ. Extensions can of course also be written in Ruby, but in this post we see how to write a simple inline macro with Groovy.

The extension API has several extension points (Source):

  • Preprocessor: Processes the raw source lines before they are passed to the parser
  • Treeprocessor: Processes the Document (AST) once parsing is complete
  • Postprocessor: Processes the output after the Document has been rendered, before it gets written to file
  • Block processor: Processes a block of content marked with a custom style (i.e., name) (equivalent to filters in AsciiDoc)
  • Block macro processor: Registers a custom block macro and process it (e.g., gist::12345[])
  • Inline macro processor: Registers a custom inline macro and process it (e.g., btn:[Save])
  • Include processor: Processes the include::[] macro

To write an extension in Groovy (or Java) we must write an implementation class for a specific extension point and register that class so AsciidoctorJ knows it can be used. Registering the implementation is very simple, because it uses the Java Service Provider mechanism. This means we have to place a file in the META-INF/services directory on the classpath. The content of the file is the fully qualified name of the implementation class.

Let's start with the Asciidoc markup and then write an implementation to process the inline macro twitter that is used:

= Groovy Inline Macro

Sample document to show extension for Asciidoctor written in Groovy.

// Here we use the twitter: macro.
// The implementation is done in Groovy.
With the twitter macro we can create links to the user's Twitter page like twitter:mrhaki[].

To implement an inline macro we create a new class and extend InlineMacroProcessor. We override the process method to return the value that needs to replace the inline macro in our Asciidoc markup.

// File: src/main/groovy/com/mrhaki/asciidoctor/extension/TwitterMacro.groovy
package com.mrhaki.asciidoctor.extension

import org.asciidoctor.extension.*
import org.asciidoctor.ast.*

import groovy.transform.CompileStatic

@CompileStatic
class TwitterMacro extends InlineMacroProcessor {

    TwitterMacro(final String name, final Map<String, Object> config) {
        super(name, config)
    }

    @Override
    protected Object process(final AbstractBlock parent, 
        final String twitterHandle, final Map<String, Object> attributes) {

        // Define options for an 'anchor' element.
        final Map options = [
            type: ':link',
            target: "http://www.twitter.com/${twitterHandle}".toString()
        ] as Map<String, Object>

        // Prepend twitterHandle with @ as text link.
        final Inline inlineTwitterLink = createInline(parent, 'anchor', "@${twitterHandle}", attributes, options)

        // Convert to String value.
        inlineTwitterLink.convert()
    }

}

We have the implementation class so now we can register the class with Asciidoctor. To register our custom extensions we need to implement the ExtensionRegistry interface. We implement the register method where we can couple our extension class to Asciidoctor.

// File: src/main/groovy/com/mrhaki/asciidoctor/extension/TwitterMacroExtension.groovy
package com.mrhaki.asciidoctor.extension

import org.asciidoctor.extension.spi.ExtensionRegistry
import org.asciidoctor.extension.JavaExtensionRegistry
import org.asciidoctor.Asciidoctor

import groovy.transform.CompileStatic

@CompileStatic
class TwitterMacroExtension implements ExtensionRegistry {

    @Override
    void register(final Asciidoctor asciidoctor) {
        final JavaExtensionRegistry javaExtensionRegistry = asciidoctor.javaExtensionRegistry()
        javaExtensionRegistry.inlineMacro 'twitter', TwitterMacro
    }

}

The class that registers our extension must be available via the Java Service Provider so it is automatically registered within the JVM used to run Asciidoctor. Therefore we need to create the file META-INF/services/org.asciidoctor.extension.spi.ExtensionRegistry with the following contents:

# File: src/main/resources/META-INF/services/org.asciidoctor.extension.spi.ExtensionRegistry
com.mrhaki.asciidoctor.extension.TwitterMacroExtension

We have now taken all the steps necessary for our inline macro implementation. We must compile the Groovy classes and add them, together with the Java Service Provider file, to the classpath. We can package the files in a JAR file and define a dependency on the JAR file in our project. If we use Gradle and the Gradle Asciidoctor plugin we can also add the source files to the buildSrc directory of our project. The files will then be compiled and added to the classpath of the Gradle project.

With the following Gradle build file we can process Asciidoc markup and execute the twitter inline macro. We store the source files in the buildSrc directory.

buildscript {
    repositories {
        jcenter()
    }

    dependencies {
        classpath 'org.asciidoctor:asciidoctor-gradle-plugin:1.5.0'
    }
}

apply plugin: 'org.asciidoctor.gradle.asciidoctor'

The build file in the buildSrc directory has a dependency on AsciidoctorJ. This module makes it possible to run Asciidoctor on the JVM.

// File: buildSrc/build.gradle.
apply plugin: 'groovy'

repositories {
    jcenter()
}

dependencies {
    compile 'org.asciidoctor:asciidoctorj:1.5.0'
}

Let's see part of the HTML that is generated if we transform the Asciidoc markup that is shown at the beginning of this blog post. The twitter inline macro is transformed into a link to the Twitter page of the user:

...
<div class="paragraph">
<p>With the twitter macro we can create links to the user’s Twitter page like <a href="http://www.twitter.com/mrhaki">@mrhaki</a>.</p>
</div>
...

Andres Almiray also wrote about writing extensions with Gradle.

Written with Asciidoctor 1.5.0 and Gradle 2.0.

Tuesday, August 26, 2014

Awesome Asciidoc: Conditional Directive to Check If Document is On GitHub

In a previous blog post we learned about the conditional directives in Asciidoctor. Dan Allen mentioned an attribute we can check with a conditional directive to see if the document is rendered on GitHub. The attribute is called env-github.

We have the following Asciidoc markup for a document stored on GitHub:

:blogpost: http://mrhaki.blogspot.com/2014/08/awesome-asciidoc-check-if-document-is.html

= Asciidoc on GitHub

Sample document for {blogpost}[Awesome Asciidoc blog post].

ifdef::env-github[]
This line is only visible if the document is on GitHub.
GitHub is using Asciidoctor {asciidoctor-version}.
endif::env-github[]

ifndef::env-github[This line is visible if not rendered on GitHub.]

To see what is rendered we can view the document on GitHub.

Written with Asciidoctor 1.5.0.

Friday, August 15, 2014

Awesome Asciidoc: Changing the FontAwesome CSS Location

To use font icons from FontAwesome we set the document attribute icons with the value font. The default link to the CSS location is https://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.1.0/css/font-awesome.min.css. We can change the location for the FontAwesome CSS with document attributes.

If we want to use a different CDN to serve the CSS we can set the document attribute iconfont-cdn and set the URI as a value:

:icons: font

// Set new URI for reference to FontAwesome CSS
:iconfont-cdn: //maxcdn.bootstrapcdn.com/font-awesome/4.1.0/css/font-awesome.min.css

== Sample doc

To reference the FontAwesome CSS from a location relative to our generated HTML page, we first unset the attribute iconfont-remote and then set the attribute iconfont-name:

:icons: font

// First unset attribute to remotely link FontAwesome CSS
:iconfont-remote!:

// Specify name of FontAwesome CSS.
:iconfont-name: fontawesome-4.1.0

// We can optionally set the directory where CSS is stored.
:stylesdir: css

== Sample doc

In the generated HTML source we see the following link element:

...
<link rel="stylesheet" href="css/fontawesome-4.1.0.css">
...

Written with Asciidoctor 1.5.0.

Awesome Asciidoc: Change URI Scheme for Assets

When we define the document attribute icons with the value font, the FontAwesome fonts are loaded in the generated HTML page. In the head section of the HTML document a link element to the FontAwesome CSS on https://cdnjs.cloudflare.com/ajax/libs is added. Also when we use the highlight.js or Prettify source highlighter, a link to the JavaScript files on the cdnjs.cloudflare.com server is generated. We can change the scheme from https to http by setting the attribute asset-uri-scheme to http. Or we can leave out the scheme so a scheme-less URI is generated for the links. A scheme-less URI has the benefit that the same protocol as the origin HTML page is used to get the CSS or JavaScript files from the cdnjs.cloudflare.com server. Remember this might cause a problem if the HTML page is opened locally.

In the next sample Asciidoc markup we change the scheme to http:

:asset-uri-scheme: http
:icons: font

== Asset URI Scheme

Sample document.

In the generated HTML we see the new scheme value:

<link rel="stylesheet" href="http://cdnjs.cloudflare.com/ajax/libs/font-awesome/4.1.0/css/font-awesome.min.css">

Now we leave the value of the asset-uri-scheme attribute empty:

:asset-uri-scheme: 
:icons: font

== Asset URI Scheme

Sample document.

The generated HTML now contains a link to the FontAwesome CSS with a scheme-less URI:

<link rel="stylesheet" href="//cdnjs.cloudflare.com/ajax/libs/font-awesome/4.1.0/css/font-awesome.min.css">

Written with Asciidoctor 1.5.0.