
December 10, 2019

Awesome Asciidoctor: Auto Number Callouts

In a previous post we learned about callouts in Asciidoctor to add explanations to source code. While surfing the Internet I came upon the following blog post by Alex Soto: Auto-numbered Callouts in Asciidoctor. It turns out that since Asciidoctor 1.5.8 we can use a dot (.) instead of explicit numbers to get automatically increasing numbering for the callouts.

Let's take our example from the earlier blog post and now use auto numbered callouts:

= Callout sample
:source-highlighter: prettify
:icons: font

[source,groovy]
----
package com.mrhaki.adoc

class Sample {
    String username // <.>

    String toString() {
        "${username?.toUpperCase() ?: 'not-defined'}" // <.>
    }
}
----
<.> Simple property definition where Groovy will generate the `setUsername` and `getUsername` methods.
<.> Return username in upper case if set, otherwise return `not-defined`.

When we convert this markup to HTML we get the following result:

This makes it very easy to add new callouts without having to change all numbers we first typed by hand.
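
For example, if we later want an extra callout on the class definition, we only insert another <.> marker and a matching item in the callout list; the numbering of the existing callouts is recalculated automatically. A small sketch based on the sample above, where only the markers and the callout list change:

[source,groovy]
----
package com.mrhaki.adoc

class Sample { // <.>
    String username // <.>

    String toString() {
        "${username?.toUpperCase() ?: 'not-defined'}" // <.>
    }
}
----
<.> New callout added later; we don't have to renumber anything by hand.
<.> Simple property definition where Groovy will generate the `setUsername` and `getUsername` methods.
<.> Return username in upper case if set, otherwise return `not-defined`.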

Written with Asciidoctor 2.0.9.

September 11, 2019

Quickly Find Unicode For Character On macOS

Sometimes when we are developing we might need to look up the Unicode value for a character. If we are using macOS we can use the Character Viewer to look up the Unicode value. We can open the Character Viewer with the key combination ⌃+⌘+Space (Ctrl+Cmd+Space) or open the Edit menu in our application and select Emoji & Symbols. We can type the character we want the Unicode value for in the Search box or look it up in the lists. When we select the character we can see the Unicode value for that character on the right:

If the Unicode value is not shown we first need to add the Unicode code table to the Character Viewer. We must open the menu at the top left and click on Customise List...:

Next we must select the Code Tables node and select Unicode:

When we search for or click on a new character we get the Unicode value.

September 10, 2019

Spocklight: Use Stub or Mock For Spring Component Using @SpringBean

When we write tests or specifications using Spock for our Spring Boot application, we might want to replace some Spring components with a stub or mock version. With the stub or mock version we can describe expected outcomes and behaviour in our specifications. Since Spock 1.2 and the Spock Spring extension we can use the @SpringBean annotation to replace a Spring component with a stub or mock version. (This is quite similar to the @MockBean annotation for Mockito mocks that is supported by Spring Boot.) We only have to declare a variable in our specification of the type of the Spring component we want to replace. We directly use the Stub() or Mock() methods to create the stub or mock version when we define the variable. From then on we can describe expected output values or behaviour just like for any other Spock stub or mock implementation.

To use the @SpringBean annotation we must add a dependency on the spock-spring module to our build. For example if we use Gradle we use the following configuration:

...
dependencies {
    ...
    testImplementation("org.spockframework:spock-spring:1.3-groovy-2.5")
    ...
}
...

Let's write a very simple Spring Boot application and use the @SpringBean annotation to create a stubbed component. First we write a simple interface with a method that accepts an argument of type String and returns a new String value:

package mrhaki.spock;

public interface MessageComponent {
    String hello(final String name);
}

Next we use this interface in a Spring REST controller where we use constructor dependency injection to inject the correct implementation of the MessageComponent interface into the controller:

package mrhaki.spock;

import org.springframework.http.MediaType;
import org.springframework.web.bind.annotation.GetMapping;
import org.springframework.web.bind.annotation.RequestParam;
import org.springframework.web.bind.annotation.RestController;

@RestController
public class MessageController {

    private final MessageComponent messageComponent;

    public MessageController(final MessageComponent messageComponent) {
        this.messageComponent = messageComponent;
    }

    @GetMapping(path = "/message", produces = MediaType.TEXT_PLAIN_VALUE)
    public String message(@RequestParam final String name) {
        return messageComponent.hello(name);
    }

}

To test the controller we write a new Spock specification. We use Spring's MockMvc support to test our controller, but the most important part in the specification is the declaration of the variable messageComponent with the annotation @SpringBean. Inside the method where we invoke /message?name=mrhaki we use the stub to declare our expected output:

package mrhaki.spock

import org.spockframework.spring.SpringBean
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest
import org.springframework.test.web.servlet.MockMvc
import spock.lang.Specification

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get

@WebMvcTest(MessageController)
class MessageControllerSpec extends Specification {

    @Autowired
    private MockMvc mockMvc

    /**
     * Instead of the real MessageComponent implementation
     * in the application context we want to use our own
     * Spock Stub implementation to have control over the
     * output of the message method.
     */
    @SpringBean
    private MessageComponent messageComponent = Stub()

    void "GET /message?name=mrhaki should return result of MessageComponent using mrhaki as argument"() {
        given: 'Stub returns value if invoked with argument "mrhaki"'
        messageComponent.hello("mrhaki") >> "Hi mrhaki"

        when: 'Get response for /message?name=mrhaki'
        final response = mockMvc.perform(get("/message").param("name", "mrhaki"))
                                .andReturn()
                                .getResponse()

        then:
        response.contentAsString == "Hi mrhaki"
    }
}
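
The specification above uses a Stub(), but the @SpringBean annotation works the same way with a Mock(). The following sketch (MessageControllerMockSpec is a made-up name, reusing the controller and component from the sample application) returns the canned value and also verifies that the controller invokes the component exactly once:

package mrhaki.spock

import org.spockframework.spring.SpringBean
import org.springframework.beans.factory.annotation.Autowired
import org.springframework.boot.test.autoconfigure.web.servlet.WebMvcTest
import org.springframework.test.web.servlet.MockMvc
import spock.lang.Specification

import static org.springframework.test.web.servlet.request.MockMvcRequestBuilders.get

@WebMvcTest(MessageController)
class MessageControllerMockSpec extends Specification {

    @Autowired
    private MockMvc mockMvc

    /**
     * This time we replace the real MessageComponent with a
     * Spock Mock, so we can verify interactions as well as
     * return canned values.
     */
    @SpringBean
    private MessageComponent messageComponent = Mock()

    void "GET /message?name=mrhaki should invoke MessageComponent.hello exactly once"() {
        when: 'Get response for /message?name=mrhaki'
        final response = mockMvc.perform(get("/message").param("name", "mrhaki"))
                                .andReturn()
                                .getResponse()

        then: 'hello is invoked once with argument "mrhaki" and returns our canned value'
        1 * messageComponent.hello("mrhaki") >> "Hi mrhaki"

        and:
        response.contentAsString == "Hi mrhaki"
    }
}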

Written with Spock 1.3-groovy-2.5 and Spring Boot 2.1.8.RELEASE.

Java Joy: Transform Elements In Stream Using a Collector

Using the Stream API and the map method we can transform the elements in a stream into other objects. Instead of using the map method we can also write a custom Collector and transform the elements when we use the collect method as the terminal operation of the stream.

First we have an example where we transform String values using the map method:

package mrhaki;

import java.util.List;
import java.util.stream.Collectors;

public class CollectorString {
    public static void main(String[] args) {
        final var items = List.of("JFall", "JavaZone", "CodeOne");

        final List<String> upper =
                items.stream()
                     .map(String::toUpperCase)
                     .collect(Collectors.toUnmodifiableList());
        
        assert upper.equals(List.of("JFALL", "JAVAZONE", "CODEONE"));
    }
}

In our next example we don't use the map method, but write a custom Collector using the Collector.of method. As the first argument we must provide the supplier that creates the data structure we want to add elements to, in this case an ArrayList. The second argument is the accumulator, where we add each element from the stream to the list and transform the value. The third argument is the combiner, where we combine multiple List instances into one List instance. The last argument is the finisher, where we turn the list into an immutable List to be returned.

package mrhaki;

import java.util.ArrayList;
import java.util.Collections;
import java.util.List;
import java.util.stream.Collector;

public class CollectorString1 {
    public static void main(String[] args) {
        final var items = List.of("JFall", "JavaZone", "CodeOne");

        final List<String> upper =
                items.stream()
                     // Use collector to transform values
                     // in the items List.
                     .collect(upperCollect());

        assert upper.equals(List.of("JFALL", "JAVAZONE", "CODEONE"));
    }

    private static Collector<String, ?, List<String>> upperCollect() {
        return Collector.of(
                // First we specify that we want to add
                // each element from the stream to an ArrayList.
                () -> new ArrayList<String>(),

                // Next we add each String value to the list
                // and turn it into an uppercase value.
                (list, value) -> list.add(value.toUpperCase()),

                // Next we get two lists we need to combine,
                // so we add the values of the second list
                // to the first list.
                (first, second) -> { first.addAll(second); return first; },

                // Finally (and optionally) we turn the 
                // ArrayList into an unmodifiable List.
                list -> Collections.unmodifiableList(list));
    }
}
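
For comparison, the JDK also offers the composing collector Collectors.mapping, which applies a transformation before delegating to a downstream collector. The following sketch (the class name CollectorString2 is made up for this example) produces the same result without a hand-written Collector:

package mrhaki;

import java.util.List;
import java.util.stream.Collectors;

public class CollectorString2 {
    public static void main(String[] args) {
        final var items = List.of("JFall", "JavaZone", "CodeOne");

        final List<String> upper =
                items.stream()
                     // Collectors.mapping transforms each value and
                     // passes it on to the downstream collector.
                     .collect(Collectors.mapping(
                             String::toUpperCase,
                             Collectors.toUnmodifiableList()));

        assert upper.equals(List.of("JFALL", "JAVAZONE", "CODEONE"));
    }
}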

Written with Java 12.

September 9, 2019

Java Joy: Combining Predicates

In Java we can use a Predicate to test if something is true or false. This is especially useful when we use the filter method of the Java Stream API. We can use lambda expressions to define our Predicate or implement the Predicate interface. If we want to combine different Predicate objects we can use the or, and and negate methods of the Predicate interface. These are default methods of the interface and return a new Predicate.

Let's start with an example where we have a list of String values. We want to filter all values that start with Gr or with M. In our first implementation we use a lambda expression as Predicate and implement both tests in this expression:

package mrhaki;

import java.util.List;
import java.util.stream.Collectors;

public class PredicateComposition1 {
    public static void main(String[] args) {
        final var items = List.of("Groovy", "Gradle", "Grails", "Micronaut", "Java", "Kotlin");

        final List<String> gr8Stuff =
                items.stream()
                     // Use lambda expression with both tests as Predicate.
                     .filter(s -> s.startsWith("Gr") || s.startsWith("M"))
                     .collect(Collectors.toUnmodifiableList());

        assert gr8Stuff.size() == 4 : "gr8Stuff contains 4 items";
        assert gr8Stuff.contains("Groovy");
        assert gr8Stuff.contains("Gradle");
        assert gr8Stuff.contains("Grails");
        assert gr8Stuff.contains("Micronaut");
    }
}

We will rewrite the previous example and introduce a startsWith method that returns a new Predicate. Then in our filter method we use the or method of the Predicate object to combine the two Predicate objects:

package mrhaki;

import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class PredicateComposition2 {
    public static void main(String[] args) {
        final var items = List.of("Groovy", "Gradle", "Grails", "Micronaut", "Java", "Kotlin");

        final List<String> gr8Stuff =
                items.stream()
                     // Use the Predicate.or method to combine two Predicate objects.
                     .filter(startsWith("Gr").or(startsWith("M")))
                     .collect(Collectors.toUnmodifiableList());

        assert gr8Stuff.size() == 4 : "gr8Stuff contains 4 items";
        assert gr8Stuff.contains("Groovy");
        assert gr8Stuff.contains("Gradle");
        assert gr8Stuff.contains("Grails");
        assert gr8Stuff.contains("Micronaut");
    }

    // Create a predicate to check if String value starts with a given value.
    private static Predicate<String> startsWith(final String begin) {
        return s -> s.startsWith(begin);
    }
}

In the following example we use the negate and and methods to find all values that do not start with Gr and have a length of less than 8 characters:

package mrhaki;

import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class PredicateComposition3 {
    public static void main(String[] args) {
        final var items = List.of("Groovy", "Gradle", "Grails", "Micronaut", "Java", "Kotlin");

        final List<String> otherStuff =
                items.stream()
                     // Find all values that do not start with "Gr"
                     // and have less than 8 characters.
                     .filter(startsWith("Gr").negate().and(smallerThan(8)))
                     .collect(Collectors.toUnmodifiableList());

        assert otherStuff.size() == 2 : "otherStuff contains 2 items";
        assert otherStuff.contains("Java");
        assert otherStuff.contains("Kotlin");
    }

    // Create a predicate to check if String value starts with a given value.
    private static Predicate<String> startsWith(final String begin) {
        return s -> s.startsWith(begin);
    }

    // Create a predicate to check if String value has 
    // less characters than the given size.
    private static Predicate<String> smallerThan(final int size) {
        return s -> s.length() < size;
    }
}

In our previous example we can replace the negate method call on our predicate with the static Predicate.not method. The predicate is then passed as an argument; it is just another way to express the same predicate:

package mrhaki;

import java.util.List;
import java.util.function.Predicate;
import java.util.stream.Collectors;

public class PredicateComposition4 {
    public static void main(String[] args) {
        final var items = List.of("Groovy", "Gradle", "Grails", "Micronaut", "Java", "Kotlin");

        final List<String> otherStuff =
                items.stream()
                     // Find all values that do not start with "Gr",
                     // using Predicate.not instead of negate, 
                     // and have less than 8 characters.
                     .filter(Predicate.not(startsWith("Gr")).and(smallerThan(8)))
                     .collect(Collectors.toUnmodifiableList());

        assert otherStuff.size() == 2 : "otherStuff contains 2 items";
        assert otherStuff.contains("Java");
        assert otherStuff.contains("Kotlin");
    }

    // Create a predicate to check if String value starts with a given value.
    private static Predicate<String> startsWith(final String begin) {
        return s -> s.startsWith(begin);
    }

    // Create a predicate to check if String value has 
    // less characters than the given size.
    private static Predicate<String> smallerThan(final int size) {
        return s -> s.length() < size;
    }
}

Written with Java 12.

Gradle Goodness: Stop Build After One Failing Test

Normally when we run the tests in our Gradle build, all our tests are executed and at the end we can see which tests failed. But what if we want the build to fail at the first failing test? Especially for a large test suite this can save a lot of time, because we don't have to run all (failing) tests; we are immediately informed that at least one test is failing.

We can do this by passing the command-line option --fail-fast when we run the test task in Gradle. With this option Gradle will stop the build and report a failure at the first failing test. Instead of passing the command-line option --fail-fast we can set the property failFast of the test task to true. Using the property failFast allows us to still fail the build on the first failing test even if we, for example, run a build task that depends on the test task. The command-line option --fail-fast only works if we run the test task directly, not if it is part of the task graph for our build when we run another task.

In the following sample we have two failing tests in our project. When we run our test task we see this in the output of our Gradle build:

$ gradle test

mrhaki.gradle.GreetingSpec > greeting should return 'Hello' followed by input FAILED
    org.spockframework.runtime.SpockComparisonFailure at GreetingSpec.groovy:12

mrhaki.gradle.HelloSpec > message should return 'Hello' FAILED
    org.spockframework.runtime.SpockComparisonFailure at HelloSpec.groovy:12

2 tests completed, 2 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':test'.
> There were failing tests. See the report at: file:///Users/mrhaki/Projects/mrhaki.com/blog/posts/samples/gradle/testfast/build/reports/tests/test/index.html

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
3 actionable tasks: 1 executed, 2 up-to-date

Now let's run the test task with the command-line option --fail-fast:

$ gradle test --fail-fast -q

mrhaki.gradle.GreetingSpec > greeting should return 'Hello' followed by input FAILED
    org.spockframework.runtime.SpockComparisonFailure at GreetingSpec.groovy:12

2 tests completed, 2 failed

FAILURE: Build failed with an exception.

* What went wrong:
Execution failed for task ':test'.
> There were failing tests. See the report at: file:///Users/mrhaki/Projects/mrhaki.com/blog/posts/samples/gradle/testfast/build/reports/tests/test/index.html

* Try:
Run with --stacktrace option to get the stack trace. Run with --info or --debug option to get more log output. Run with --scan to get full insights.

* Get more help at https://help.gradle.org

BUILD FAILED in 1s
3 actionable tasks: 1 executed, 2 up-to-date

Although the totals still indicate 2 tests completed, 2 failed, we can see in the output that with the fail fast option the test in the HelloSpec specification is not run this time.

Instead of using the command-line option --fail-fast we can set the property failFast in our test task, like in the following example Gradle build:

...
test {
   failFast = true
}
...
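
If we don't want to hard-code the value, we could also derive it from a project property. In the following sketch failFastTests is a made-up property name that we would pass on the command line with -PfailFastTests; it is not a Gradle built-in:

...
test {
    // Hypothetical project property; enable with: gradle build -PfailFastTests
    failFast = project.hasProperty('failFastTests')
}
...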

Written with Gradle 5.6.2.

May 24, 2019

Awesome Asciidoctor: Include Asciidoc Markup With Listing or Literal Blocks Inside Listing or Literal Block

If we want to include Asciidoc markup as the source language and show the markup without transforming it, we can use a listing or literal block.
For example we are using Asciidoc markup to write a document about Asciidoctor and want to include some Asciidoc markup examples.
If the included markup contains a listing or literal block itself and it is enclosed in a listing or literal block, the transformation goes wrong,
because the beginning of the included listing or literal block is seen as the ending of the enclosing listing or literal block.
Let's see what goes wrong with an example where we have the following Asciidoc markup:

In the following example we see a listing block that can be defined in Asciidoc markup.

[source,asciidoc]
----
= Sample document

We can include listing blocks which are ideal for including source code.

.A listing block
----
public class Sample { }
----

.A literal block
....
public class Sample { }
....

Asciidoctor awesomeness!
----

When we transform this to HTML we get the following output:

We can use nested listing or literal blocks where we have to add an extra hyphen or dot to the nested block, but then the Asciidoc markup we want to show as an example is not correct anymore.
It turns out we can also add an extra hyphen or dot to the enclosing listing or literal block to transform our markup correctly.
So in our example we add an extra hyphen to the outer listing block:

In the following example we see a listing block that can be defined in Asciidoc markup.

[source,asciidoc]
// This time we have 5 hyphens.
-----
= Sample document

We can include listing blocks which are ideal for including source code.

.A listing block
----
public class Sample { }
----

.A literal block
....
public class Sample { }
....

Asciidoctor awesomeness!
-----

The transformed HTML looks like this:

If the sample markup we want to show only contains a listing block, we could have enclosed it in a literal block to get the same result, and sample markup with only a literal block could be placed in a listing block.
But in our example we have both a listing and a literal block, so we needed another solution to get the desired result.
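
To illustrate the first alternative: sample markup with only a listing block could be wrapped in a literal block (delimited by four dots), because the hyphens of the inner listing block do not end the enclosing literal block. A small sketch:

....
.A listing block
----
public class Sample { }
----
....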

Written with Asciidoctor 2.0.9.

May 22, 2019

Awesome Asciidoctor: Don't Wrap Lines in All Listing or Literal Blocks of Document

In a previous post we learned about setting the value of the options attribute to nowrap on listing and literal blocks, so the lines in the block are not wrapped. In the comments a user suggested another option to disable line wrapping for all listing and literal blocks in the document: the document attribute prewrap. We must negate the document attribute, :prewrap!:, to disable the wrapping. If we place this document attribute at the top of our Asciidoctor document it is applied to the whole document. We can also place it at other places in the document to change the setting for all listing and literal blocks that follow the prewrap document attribute. To enable wrapping again we set :prewrap: (leaving out the exclamation mark).

In the following example we have markup with a listing, literal and example block and we use the document attribute :prewrap!: to disable the wrapping for the listing and literal blocks:

= Wrapping lines sample

// Disable wrapping in listing and literal blocks.
:prewrap!:

== Listing

[source,groovy]
----
class GroovyHelloWorld {
    String sayHello(final String message = 'world') { // Set default argument value for message argument
        "Hello $message !"
     }
}
----

== Literal

....
class GroovyHelloWorld {
    String sayHello(final String message = 'world') { // Set default argument value for message argument
        "Hello $message !"
     }
}
....

== Example

====
// We use spacing to mimic a code block.
class GroovyHelloWorld { +
{nbsp}{nbsp}{nbsp}{nbsp}String sayHello(final String message = 'world') { // Set default argument value for message argument +
{nbsp}{nbsp}{nbsp}{nbsp}{nbsp}{nbsp}{nbsp}{nbsp}"Hello $message !" +
{nbsp}{nbsp}{nbsp}{nbsp}} +
}
====

When we create HTML from this markup we get the following output:

The code in the listing and literal blocks is now not wrapped, but continues on the same line.

Written with Asciidoctor 2.0.2.

April 19, 2019

Gradle Goodness: Use bill of materials (BOM) As Dependency Constraints

Since Gradle 5 we can easily use a bill of materials (BOM) in our build file to get recommended dependency versions. The dependency versions defined in the BOM become dependency constraints in Gradle. This means the dependencies we define in our build that are part of the BOM don't need a version, because the version is resolved via the dependency constraint that is defined in the BOM. Transitive dependency versions are also resolved using the BOM where applicable. We use the dependency handler method platform to define the BOM we want to import. The versions in the BOM are recommendations, and we can override a recommendation by specifying an explicit version for a dependency found in the BOM.

In the following example build file we import the BOM for Spring Boot 2.1.4.RELEASE. We also add two dependencies that are defined in the BOM: commons-codec:commons-codec and org.yaml:snakeyaml. One without a version and one with an explicit version to override the version defined in the BOM:

// File: build.gradle.kts
plugins {
    java
}

repositories {
    jcenter()
}

dependencies {
    // Load bill of materials (BOM) for Spring Boot.
    // The dependencies in the BOM will be 
    // dependency constraints in our build.
    implementation(platform("org.springframework.boot:spring-boot-dependencies:2.1.4.RELEASE"))

    // Use dependency defined in BOM.
    // Version is not needed, because the version
    // defined in the BOM is a dependency constraint
    // that is used.
    implementation("commons-codec:commons-codec")

    // Override version for dependency in the BOM.
    // Version in BOM is 1.23.
    implementation(group = "org.yaml", 
                   name = "snakeyaml", 
                   version = "1.24")
}

When we run the dependencies task for the configuration compileClasspath we can see how the dependencies are resolved. Notice that version 1.24 for snakeyaml is used, while the version in the BOM is 1.23:

$ gradle -q dependencies --configuration compileClasspath
------------------------------------------------------------
Root project
------------------------------------------------------------

compileClasspath - Compile classpath for source set 'main'.
+--- org.springframework.boot:spring-boot-dependencies:2.1.4.RELEASE
|    +--- commons-codec:commons-codec:1.11 (c)
|    \--- org.yaml:snakeyaml:1.23 -> 1.24 (c)
+--- commons-codec:commons-codec -> 1.11
\--- org.yaml:snakeyaml:1.24

(c) - dependency constraint
A web-based, searchable dependency report is available by adding the --scan option.
$

The dependency handler method enforcedPlatform is also available. When we use this method to import a BOM in our build, the versions in the BOM are forced for the dependencies we use. Even if we define an explicit version for a dependency found in the BOM, the version described in the BOM is forced.

// File: build.gradle.kts
plugins {
    java
}

repositories {
    jcenter()
}

dependencies {
    // Load bill of materials (BOM) for Spring Boot.
    // The dependencies in the BOM will be 
    // dependency constraints in our build, but
    // the versions in the BOM are forced for
    // used dependencies.
    implementation(enforcedPlatform("org.springframework.boot:spring-boot-dependencies:2.1.4.RELEASE"))

    // Use dependency defined in BOM.
    // Version is not needed, because the version
    // defined in the BOM is a dependency constraint
    // that is used.
    implementation("commons-codec:commons-codec")

    // Version in BOM is 1.23 and because
    // we use enforcedPlatform the version
    // will be 1.23 once the dependency is resolved,
    // even though we define a newer version explicitly.
    implementation(group = "org.yaml", 
                   name = "snakeyaml", 
                   version = "1.24")
}

Now we look at the resolved dependencies and see how the snakeyaml dependency is forced to use version 1.23:

$ gradle -q dependencies --configuration compileClasspath
------------------------------------------------------------
Root project
------------------------------------------------------------

compileClasspath - Compile classpath for source set 'main'.
+--- org.springframework.boot:spring-boot-dependencies:2.1.4.RELEASE
|    +--- commons-codec:commons-codec:1.11 (c)
|    \--- org.yaml:snakeyaml:1.23 (c)
+--- commons-codec:commons-codec -> 1.11
\--- org.yaml:snakeyaml:1.24 -> 1.23

(c) - dependency constraint
A web-based, searchable dependency report is available by adding the --scan option.
$

Read more about the Gradle BOM support on the Gradle website.

Written with Gradle 5.4.

Gradle Goodness: Manage Dependency Versions With Dependency Constraints

From Maven builds we know the dependencyManagement section in our POM file. In that section we can describe dependencies with their version, and later in the dependencies section we can refer to a dependency without the version. We can use dependency constraints in Gradle to do the same thing. A dependency constraint can be used to define the version or version range for a dependency defined in our scripts or for a transitive dependency. Just like a dependency, a dependency constraint is defined for a configuration, so we can fine-tune the constraints to the correct configuration.

Using dependency constraints in a multi-project build allows us to define the dependency versions in the root build file and declare the project dependencies per project without a version. The version is then taken from the dependency constraint defined in the root build file.
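
A sketch of such a root build file, using the Kotlin DSL like the example below and assuming the root project applies the plugin and the constraints to all subprojects, could look like this:

// File: build.gradle.kts (root project) - a sketch
subprojects {
    apply(plugin = "groovy")

    repositories {
        jcenter()
    }

    dependencies {
        // Define the dependency versions for all subprojects in one place.
        constraints {
            add("implementation", "com.google.guava:guava:27.1-jre")
            add("testImplementation", "org.spockframework:spock-core:1.3-groovy-2.5")
        }
    }
}

Each subproject then only declares its dependencies without a version, just like the second dependencies block in the example below.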

In the following example build script we define two dependency constraints for different configurations:

// File: build.gradle.kts
plugins {
    groovy
}

repositories {
    jcenter()
}

// In a multi-project build, this dependencies
// block with constraints could be in the 
// root build file, so all versions are 
// defined in one place.
dependencies {
    constraints {
        // Define dependency with version to be used.
        // This version is used when we define a dependency
        // for guava without a version.
        implementation("com.google.guava:guava:27.1-jre")

        // Constraints are scoped to configurations,
        // so we can make specific constraints for a configuration.
        // In this case we want the dependency on Spock defined
        // in the testImplementation configuration to be a specific version.
        // Here we use named arguments to define the dependency constraint.
        testImplementation(group = "org.spockframework", 
                           name = "spock-core", 
                           version = "1.3-groovy-2.5")
    }
}

// In a multi-project build this dependencies block
// could be in subprojects, where the dependency
// declarations do not need a version, because the
// versions are defined in the root build file using 
// constraints.
dependencies {
    // Because of the dependency constraint, 
    // we don't have to specify the dependency version here.
    implementation("com.google.guava:guava")

    // Another dependency without version for the 
    // testImplementation configuration.
    testImplementation("org.spockframework:spock-core")
}

Let's run the dependencies task for the configuration compileClasspath to see how the dependencies are resolved:

$ gradle -q dependencies --configuration compileClasspath
------------------------------------------------------------
Root project
------------------------------------------------------------

compileClasspath - Compile classpath for source set 'main'.
+--- com.google.guava:guava -> 27.1-jre
|    +--- com.google.guava:failureaccess:1.0.1
|    +--- com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
|    +--- com.google.code.findbugs:jsr305:3.0.2
|    +--- org.checkerframework:checker-qual:2.5.2
|    +--- com.google.errorprone:error_prone_annotations:2.2.0
|    +--- com.google.j2objc:j2objc-annotations:1.1
|    \--- org.codehaus.mojo:animal-sniffer-annotations:1.17
\--- com.google.guava:guava:27.1-jre (c)

(c) - dependency constraint
A web-based, searchable dependency report is available by adding the --scan option.
$

We see that the Guava dependency is resolved to version 27.1-jre. The output also shows with (c) that a dependency constraint was used to resolve the dependency.

When we look at the testCompileClasspath configuration we see the following output:

$ gradle -q dependencies --configuration testCompileClasspath
> Task :dependencies

------------------------------------------------------------
Root project
------------------------------------------------------------

testCompileClasspath - Compile classpath for source set 'test'.
+--- com.google.guava:guava -> 27.1-jre
|    +--- com.google.guava:failureaccess:1.0.1
|    +--- com.google.guava:listenablefuture:9999.0-empty-to-avoid-conflict-with-guava
|    +--- com.google.code.findbugs:jsr305:3.0.2
|    +--- org.checkerframework:checker-qual:2.5.2
|    +--- com.google.errorprone:error_prone_annotations:2.2.0
|    +--- com.google.j2objc:j2objc-annotations:1.1
|    \--- org.codehaus.mojo:animal-sniffer-annotations:1.17
+--- com.google.guava:guava:27.1-jre (c)
+--- org.spockframework:spock-core:1.3-groovy-2.5
|    +--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-json:2.5.4
|    |    \--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-nio:2.5.4
|    |    \--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-macro:2.5.4
|    |    \--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-templates:2.5.4
|    |    +--- org.codehaus.groovy:groovy:2.5.4
|    |    \--- org.codehaus.groovy:groovy-xml:2.5.4
|    |         \--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-test:2.5.4
|    |    +--- org.codehaus.groovy:groovy:2.5.4
|    |    \--- junit:junit:4.12
|    |         \--- org.hamcrest:hamcrest-core:1.3
|    +--- org.codehaus.groovy:groovy-sql:2.5.4
|    |    \--- org.codehaus.groovy:groovy:2.5.4
|    +--- org.codehaus.groovy:groovy-xml:2.5.4 (*)
|    \--- junit:junit:4.12 (*)
\--- org.spockframework:spock-core -> 1.3-groovy-2.5 (*)

(c) - dependency constraint
(*) - dependencies omitted (listed previously)

A web-based, searchable dependency report is available by adding the --scan option.
$

Read more about Gradle dependency constraints on the Gradle website.

Written with Gradle 5.4.