Appendices
Having trouble with Spring Cloud Data Flow? We’d like to help!
- Ask a question. We monitor stackoverflow.com for questions tagged with spring-cloud-dataflow.
- Report bugs with Spring Cloud Data Flow at github.com/spring-cloud/spring-cloud-dataflow/issues.
Appendix A: Data Flow Template
As described in the API Guide chapter, Spring Cloud Data Flow’s functionality is completely exposed through REST endpoints. While you can use those endpoints directly, Spring Cloud Data Flow also provides a Java-based API, which makes using those REST endpoints even easier.
The central entry point is the DataFlowTemplate class in the org.springframework.cloud.dataflow.rest.client package.
This class implements the DataFlowOperations interface and delegates to the following sub-templates that provide the specific functionality for each feature set:
Interface | Description |
---|---|
StreamOperations | REST client for stream operations |
CounterOperations | REST client for counter operations |
FieldValueCounterOperations | REST client for field value counter operations |
AggregateCounterOperations | REST client for aggregate counter operations |
TaskOperations | REST client for task operations |
JobOperations | REST client for job operations |
AppRegistryOperations | REST client for app registry operations |
CompletionOperations | REST client for completion operations |
RuntimeOperations | REST client for runtime operations |
When the DataFlowTemplate is initialized, the sub-templates are discovered through the REST relations, which are provided by HATEOAS (Hypermedia as the Engine of Application State).
If a resource cannot be resolved, the respective sub-template is null. A common cause is that Spring Cloud Data Flow allows specific sets of features to be enabled or disabled when launching. For more information, see the local, Cloud Foundry, or Kubernetes configuration chapter, depending on where you deploy your application.
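Because a disabled feature set leaves its sub-template unresolved, it is worth guarding against a null accessor before use. The following is a minimal sketch, assuming an already constructed dataFlowTemplate instance (construction is shown in the next section):

// TaskOperations comes from the org.springframework.cloud.dataflow.rest.client package.
TaskOperations taskOperations = dataFlowTemplate.taskOperations();
if (taskOperations == null) {
    // The "tasks" feature set is disabled on this Data Flow server.
    System.out.println("Task operations are not available on this server.");
}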
A.1. Using the Data Flow Template
When you use the Data Flow Template, the only needed Data Flow dependency is the Spring Cloud Data Flow REST client, as shown in the following Maven snippet:
<dependency>
<groupId>org.springframework.cloud</groupId>
<artifactId>spring-cloud-dataflow-rest-client</artifactId>
<version>2.7.2</version>
</dependency>
With that dependency, you get the DataFlowTemplate class as well as all the dependencies needed to make calls to a Spring Cloud Data Flow server.
When instantiating the DataFlowTemplate, you also pass in a RestTemplate. Note that the RestTemplate requires some additional configuration to be valid in the context of the DataFlowTemplate. When declaring a RestTemplate as a bean, the following configuration suffices:
@Bean
public static RestTemplate restTemplate() {
    RestTemplate restTemplate = new RestTemplate();
    // Translate "vnd.error" responses from the server into exceptions.
    restTemplate.setErrorHandler(new VndErrorResponseErrorHandler(restTemplate.getMessageConverters()));
    for (HttpMessageConverter<?> converter : restTemplate.getMessageConverters()) {
        if (converter instanceof MappingJackson2HttpMessageConverter) {
            final MappingJackson2HttpMessageConverter jacksonConverter =
                    (MappingJackson2HttpMessageConverter) converter;
            // Register HAL support and the Spring Batch mix-ins needed to
            // deserialize job- and step-related resources.
            jacksonConverter.getObjectMapper()
                    .registerModule(new Jackson2HalModule())
                    .addMixIn(JobExecution.class, JobExecutionJacksonMixIn.class)
                    .addMixIn(JobParameters.class, JobParametersJacksonMixIn.class)
                    .addMixIn(JobParameter.class, JobParameterJacksonMixIn.class)
                    .addMixIn(JobInstance.class, JobInstanceJacksonMixIn.class)
                    .addMixIn(ExitStatus.class, ExitStatusJacksonMixIn.class)
                    .addMixIn(StepExecution.class, StepExecutionJacksonMixIn.class)
                    .addMixIn(ExecutionContext.class, ExecutionContextJacksonMixIn.class)
                    .addMixIn(StepExecutionHistory.class, StepExecutionHistoryJacksonMixIn.class);
        }
    }
    return restTemplate;
}
You can also get a pre-configured RestTemplate by calling DataFlowTemplate.getDefaultDataflowRestTemplate().
Now you can instantiate the DataFlowTemplate with the following code:
DataFlowTemplate dataFlowTemplate = new DataFlowTemplate(
        new URI("http://localhost:9393/"), restTemplate); (1)
1 | The URI points to the ROOT of your Spring Cloud Data Flow Server. |
Depending on your requirements, you can now make calls to the server. For instance, if you want to get a list of the currently available applications, you can run the following code:
PagedModel<AppRegistrationResource> apps = dataFlowTemplate.appRegistryOperations().list();

System.out.println(String.format("Retrieved %s application(s)",
        apps.getContent().size()));

for (AppRegistrationResource app : apps.getContent()) {
    System.out.println(String.format("App Name: %s, App Type: %s, App URI: %s",
            app.getName(),
            app.getType(),
            app.getUri()));
}
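The other sub-templates follow the same pattern. As a minimal sketch (the resource accessors shown, such as getDslText(), reflect the 2.x REST client classes and may differ in other releases), you can list the currently defined streams as follows:

PagedModel<StreamDefinitionResource> streams = dataFlowTemplate.streamOperations().list();

for (StreamDefinitionResource stream : streams.getContent()) {
    // Print each stream definition's name and its DSL text.
    System.out.println(String.format("Stream Name: %s, Definition: %s",
            stream.getName(),
            stream.getDslText()));
}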
A.2. Data Flow Template and Security
When using the DataFlowTemplate, you can also provide all the security-related options as if you were using the Data Flow Shell. In fact, the Data Flow Shell uses the DataFlowTemplate for all its operations.
To get you started, we provide an HttpClientConfigurer that uses the builder pattern to set the various security-related options:
HttpClientConfigurer
    .create(targetUri) (1)
    .basicAuthCredentials(username, password) (2)
    .skipTlsCertificateVerification() (3)
    .withProxyCredentials(proxyUri, proxyUsername, proxyPassword) (4)
    .addInterceptor(interceptor) (5)
    .buildClientHttpRequestFactory() (6)
1 | Creates an HttpClientConfigurer with the provided target URI. |
2 | Sets the credentials for basic authentication (using OAuth2 Password Grant). |
3 | Skips SSL certificate verification (use for DEVELOPMENT ONLY!). |
4 | Configures any proxy settings. |
5 | Adds a custom interceptor, for example to set the OAuth2 Authorization header. This allows you to pass an OAuth2 access token instead of username/password credentials. |
6 | Builds the ClientHttpRequestFactory that can be set on the RestTemplate. |
Once the HttpClientConfigurer is configured, you can use its buildClientHttpRequestFactory method to build the ClientHttpRequestFactory and then set the corresponding property on the RestTemplate. You can then instantiate the actual DataFlowTemplate with that RestTemplate.
To configure Basic Authentication, the following setup is required:
URI targetUri = URI.create("http://localhost:9393");

RestTemplate restTemplate = DataFlowTemplate.getDefaultDataflowRestTemplate();

HttpClientConfigurer httpClientConfigurer = HttpClientConfigurer.create(targetUri);
httpClientConfigurer.basicAuthCredentials("my_username", "my_password");
restTemplate.setRequestFactory(httpClientConfigurer.buildClientHttpRequestFactory());

DataFlowTemplate dataFlowTemplate = new DataFlowTemplate(targetUri, restTemplate);
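If you use an OAuth2 access token instead of username/password credentials, you can register a custom interceptor, as mentioned in the builder overview above. The following is a minimal sketch (the accessToken value is an assumption; obtain a real token from your OAuth2 provider):

URI targetUri = URI.create("http://localhost:9393");
String accessToken = "..."; // assumption: a valid OAuth2 access token

// ClientHttpRequestInterceptor is a functional interface, so a lambda suffices.
ClientHttpRequestInterceptor bearerTokenInterceptor = (request, body, execution) -> {
    // Attach the token as a standard Bearer Authorization header.
    request.getHeaders().add("Authorization", "Bearer " + accessToken);
    return execution.execute(request, body);
};

RestTemplate restTemplate = DataFlowTemplate.getDefaultDataflowRestTemplate();
restTemplate.setRequestFactory(HttpClientConfigurer.create(targetUri)
        .addInterceptor(bearerTokenInterceptor)
        .buildClientHttpRequestFactory());

DataFlowTemplate dataFlowTemplate = new DataFlowTemplate(targetUri, restTemplate);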
You can find a sample application as part of the spring-cloud-dataflow-samples repository on GitHub.
Appendix B: “How-to” guides
This section provides answers to some common ‘how do I do that…’ questions that often arise when people use Spring Cloud Data Flow.
If you have a specific problem that we do not cover here, you might want to check out stackoverflow.com to see if someone has already provided an answer.
That is also a great place to ask new questions (use the spring-cloud-dataflow tag).
We are also more than happy to extend this section. If you want to add a “how-to”, you can send us a pull request.
B.1. Configure Maven Properties
You can set the Maven properties, such as the local Maven repository location, remote Maven repositories, authentication credentials, and proxy server properties, through command-line properties when you start the Data Flow server. Alternatively, you can set the properties by setting the SPRING_APPLICATION_JSON environment property for the Data Flow server.
The remote Maven repositories need to be configured explicitly if the applications are resolved by using the Maven repository, except for a local Data Flow server. The other Data Flow server implementations (which use Maven resources for application artifact resolution) have no default value for remote repositories. The local server has repo.spring.io/libs-snapshot as the default remote repository.
To pass the properties as command-line options, run the server with a command similar to the following:
$ java -jar <dataflow-server>.jar --maven.localRepository=mylocal \
--maven.remote-repositories.repo1.url=https://repo1 \
--maven.remote-repositories.repo1.auth.username=repo1user \
--maven.remote-repositories.repo1.auth.password=repo1pass \
--maven.remote-repositories.repo2.url=https://repo2 --maven.proxy.host=proxyhost \
--maven.proxy.port=9018 --maven.proxy.auth.username=proxyuser \
--maven.proxy.auth.password=proxypass
You can also use the SPRING_APPLICATION_JSON environment property:
export SPRING_APPLICATION_JSON='{ "maven": { "local-repository": "local","remote-repositories": { "repo1": { "url": "https://repo1", "auth": { "username": "repo1user", "password": "repo1pass" } },
"repo2": { "url": "https://repo2" } }, "proxy": { "host": "proxyhost", "port": 9018, "auth": { "username": "proxyuser", "password": "proxypass" } } } }'
Here is the same content in nicely formatted JSON:
SPRING_APPLICATION_JSON='{
"maven": {
"local-repository": "local",
"remote-repositories": {
"repo1": {
"url": "https://repo1",
"auth": {
"username": "repo1user",
"password": "repo1pass"
}
},
"repo2": {
"url": "https://repo2"
}
},
"proxy": {
"host": "proxyhost",
"port": 9018,
"auth": {
"username": "proxyuser",
"password": "proxypass"
}
}
}
}'
Depending on the Spring Cloud Data Flow server implementation, you may have to pass the environment properties by using the platform-specific environment-setting capabilities. For instance, in Cloud Foundry, you would pass them by using cf set-env SPRING_APPLICATION_JSON.
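For example, the Cloud Foundry command might look like the following (the application name dataflow-server is an assumption; substitute the name of your deployed server application):

$ cf set-env dataflow-server SPRING_APPLICATION_JSON '{"maven": {"remote-repositories": {"repo1": {"url": "https://repo1"}}}}'

After setting the variable, restage or restart the application so that the new environment takes effect.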
B.2. Troubleshooting
B.3. Frequently Asked Questions
In this section, we review the frequently asked questions for Spring Cloud Data Flow. See the Frequently Asked Questions section of the microsite for more information.
Appendix C: Building
This appendix describes how to build Spring Cloud Data Flow.
To build the source, you need to install JDK 1.8.
The build uses the Maven wrapper so that you do not have to install a specific version of Maven.
The main build command is as follows:
$ ./mvnw clean install
To speed up the build, you can add -DskipTests to avoid running the tests.
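For example, a full build that skips the tests looks like this:

$ ./mvnw clean install -DskipTests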
You can also install Maven (>=3.3.3) yourself and run the mvn command in place of ./mvnw in the examples below. If you do that, you also might need to add -P spring if your local Maven settings do not contain repository declarations for Spring pre-release artifacts.
You might need to increase the amount of memory available to Maven by setting a MAVEN_OPTS environment variable with a value similar to -Xmx512m -XX:MaxPermSize=128m. We try to cover this in the .mvn configuration, so, if you find you have to do it to make a build succeed, please raise a ticket to get the settings added to source control.
C.1. Documentation
There is a full profile that generates documentation. You can build only the documentation by using the following command:
$ ./mvnw clean package -DskipTests -P full -pl {project-artifactId} -am
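For example, assuming the documentation module in this repository is spring-cloud-dataflow-docs (substitute the actual artifactId if it differs):

$ ./mvnw clean package -DskipTests -P full -pl spring-cloud-dataflow-docs -am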
C.2. Working with the Code
If you do not have a favorite IDE, we recommend that you use Spring Tool Suite or Eclipse when working with the code. We use the m2eclipse Eclipse plugin for Maven support. Other IDEs and tools generally also work without issue.
C.2.1. Importing into Eclipse with m2eclipse
We recommend the m2eclipse Eclipse plugin when working with Eclipse. If you do not already have m2eclipse installed, it is available from the Eclipse marketplace.
Unfortunately, m2e does not yet support Maven 3.3. Consequently, once the projects are imported into Eclipse, you also need to tell m2eclipse to use the .settings.xml file for the projects. If you do not do this, you may see many different errors related to the POMs in the projects.
To do so:
- Open your Eclipse preferences.
- Expand the Maven preferences.
- Select User Settings.
- In the User Settings field, click Browse and navigate to the Spring Cloud project you imported.
- Select the .settings.xml file in that project.
- Click Apply.
- Click OK.
Alternatively, you can copy the repository settings from Spring Cloud’s .settings.xml file into your own ~/.m2/settings.xml.
C.2.2. Importing into Eclipse without m2eclipse
If you prefer not to use m2eclipse, you can generate Eclipse project metadata by using the following command:
$ ./mvnw eclipse:eclipse
You can import the generated Eclipse projects by selecting Import existing projects from the File menu.