Spring Boot Cloud Contract usage with WireMock for integration tests and UI testing
I want to configure and run the Spring Cloud Contract WireMock server when I start my Spring Boot server.
Basically, I am planning to deploy my application WAR file into a Spring Boot project and then run integration tests against a WireMock server. As per the Spring guide, we can start and stop a WireMock server in the test classes by annotating them with specific configuration, but I want to configure and start the WireMock server outside of the test classes, so that I can also test my application from the UI by simulating the back-end REST services with WireMock.
These are the steps I want to take:
1) Create a Spring Boot project and deploy the already available WAR file to the embedded server.
2) Spin up WireMock when the Spring Boot server starts up.
3) Use the WireMock server for UI testing and integration test cases.
Please help me understand how to configure the WireMock server and run it when Spring Boot starts up.
First I will add the configuration below to deploy the existing WAR to the embedded Tomcat:
@Bean
public EmbeddedServletContainerFactory servletContainerFactory() {
    return new TomcatEmbeddedServletContainerFactory() {
        @Override
        protected TomcatEmbeddedServletContainer getTomcatEmbeddedServletContainer(Tomcat tomcat) {
            // Ensure that the webapps directory exists
            new File(tomcat.getServer().getCatalinaBase(), "webapps").mkdirs();
            try {
                Context context = tomcat.addWebapp("/foo", "/path/to/foo.war");
                // Allow the webapp to load classes from your fat jar
                context.setParentClassLoader(getClass().getClassLoader());
            } catch (ServletException ex) {
                throw new IllegalStateException("Failed to add webapp", ex);
            }
            return super.getTomcatEmbeddedServletContainer(tomcat);
        }
    };
}
Then I want a configuration bean that starts the WireMock server:
WireMockServer wireMockServer = new WireMockServer(options().port(8089)); // no-args constructor would start on port 8080, no HTTPS
wireMockServer.start();
The problem is that I am not able to work out where I should instantiate this WireMockServer so that it is available both for UI testing of the application and for the integration tests.
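What I have in mind is something like the following sketch: expose the WireMockServer as a Spring-managed bean so it starts and stops with the application context. The port, stub URL, and class name are only illustrative, and this assumes WireMock is on the application's runtime classpath.

import com.github.tomakehurst.wiremock.WireMockServer;
import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;

import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;
import static com.github.tomakehurst.wiremock.core.WireMockConfiguration.options;

@Configuration
public class WireMockConfig {

    // The same server instance lives for the whole application lifetime,
    // so UI sessions and integration tests hit the same stubbed endpoints.
    @Bean(destroyMethod = "stop")
    public WireMockServer wireMockServer() {
        WireMockServer server = new WireMockServer(options().port(8089));
        server.start();
        // Example stub simulating a back-end REST service.
        server.stubFor(get(urlEqualTo("/backend/ping"))
                .willReturn(aResponse().withStatus(200).withBody("pong")));
        return server;
    }
}

With the back-end base URL of the deployed WAR pointed at http://localhost:8089, manual UI testing and the integration tests would both exercise the same simulated services.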
See also questions close to this topic
- Returning values from @Bean methods
What happens to the object I return from an @Bean method? Is there any way to retrieve this object from another class?
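For example (a minimal sketch with made-up names): the object returned from an @Bean method is registered in the ApplicationContext as a singleton bean, so another class can simply have it injected, or look it up with ApplicationContext.getBean.

import org.springframework.context.annotation.Bean;
import org.springframework.context.annotation.Configuration;
import org.springframework.stereotype.Component;

@Configuration
class AppConfig {

    // This exact instance becomes a singleton bean named "priceCalculator".
    @Bean
    public PriceCalculator priceCalculator() {
        return new PriceCalculator();
    }
}

class PriceCalculator {
    double withTax(double net) {
        return net * 1.2;
    }
}

@Component
class CheckoutService {

    private final PriceCalculator calculator;

    // Spring injects the very object the @Bean method returned.
    CheckoutService(PriceCalculator calculator) {
        this.calculator = calculator;
    }

    double total(double net) {
        return calculator.withTax(net);
    }
}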
- Having Problem With Click Listener in Java
I'm making a chess-like game in Java and I'm having an issue with the click events. The mouseClicked function isn't responding to my clicks on the window, for no apparent reason. I have already tried a few things, such as changing class names and using different functions, but nothing has worked.
package main.game.com;

import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;

public class ClickEvent extends MouseAdapter {
    public void mouseClicked(MouseEvent e) {
        System.out.println("hello");
    }
}
package main.game.com;

import java.awt.Canvas;

public class Main extends Canvas {
    private static final long serialVersionUID = 1673528055664762143L;
    private static final int WIDTH = 416, HEIGHT = 439;

    public Main() {
        Window window = new Window(WIDTH, HEIGHT, "DARRAGH", this);
        this.addMouseListener(new ClickEvent());
    }
package main.game.com;

import java.awt.Canvas;
import java.awt.Dimension;
import javax.swing.JFrame;

public class Window extends Canvas {
    private static final long serialVersionUID = 6733885629776844621L;

    public Window(int width, int height, String title, Main main) {
        JFrame frame = new JFrame(title);
        frame.setPreferredSize(new Dimension(width, height));
        frame.setMaximumSize(new Dimension(width, height));
        frame.setMinimumSize(new Dimension(width, height));
        frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
        frame.setResizable(false);
        frame.add(main);
        frame.setVisible(true);
        main.start();
    }
}
The first set of code is my MouseAdapter class and the second is the first part of my main class containing the click listener.
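For comparison, here is a stripped-down, self-contained sketch (not your exact code) where a MouseAdapter on a Canvas does receive clicks; if this prints but your project does not, the difference lies in how the listener and the displayed component are wired together.

import java.awt.Canvas;
import java.awt.Dimension;
import java.awt.event.MouseAdapter;
import java.awt.event.MouseEvent;
import javax.swing.JFrame;
import javax.swing.SwingUtilities;

public class ClickDemo {
    public static void main(String[] args) {
        SwingUtilities.invokeLater(() -> {
            Canvas canvas = new Canvas();
            canvas.setPreferredSize(new Dimension(416, 439));
            // Listener registered directly on the component that is added to the frame.
            canvas.addMouseListener(new MouseAdapter() {
                @Override
                public void mouseClicked(MouseEvent e) {
                    System.out.println("clicked at " + e.getX() + "," + e.getY());
                }
            });
            JFrame frame = new JFrame("Click test");
            frame.setDefaultCloseOperation(JFrame.EXIT_ON_CLOSE);
            frame.add(canvas);
            frame.pack();
            frame.setVisible(true);
        });
    }
}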
- JDBC choose NIC to use based on local ip address?
I'm using the PostgreSQL JDBC Driver on a machine that has multiple network cards. Only one of them has a stable connection. Is there any way to tell JDBC to use that NIC by giving it the IP address of the NIC?
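One approach that might apply here (an assumption on my part, not verified against the driver internals): the PostgreSQL JDBC driver accepts socketFactory and socketFactoryArg connection properties, so a custom javax.net.SocketFactory that binds every socket to the chosen local address could force traffic through that NIC. A sketch:

import java.io.IOException;
import java.net.InetAddress;
import java.net.InetSocketAddress;
import java.net.Socket;
import javax.net.SocketFactory;

// Custom factory that binds every socket to a chosen local address (NIC).
// Assumption: the driver instantiates it with the socketFactoryArg String.
public class BoundSocketFactory extends SocketFactory {

    private final InetAddress localAddress;

    public BoundSocketFactory(String localIp) throws IOException {
        this.localAddress = InetAddress.getByName(localIp);
    }

    @Override
    public Socket createSocket() throws IOException {
        Socket socket = new Socket();
        socket.bind(new InetSocketAddress(localAddress, 0)); // 0 = any free local port
        return socket;
    }

    @Override
    public Socket createSocket(String host, int port) throws IOException {
        return new Socket(host, port, localAddress, 0);
    }

    @Override
    public Socket createSocket(String host, int port, InetAddress localHost, int localPort) throws IOException {
        return new Socket(host, port, localAddress, localPort);
    }

    @Override
    public Socket createSocket(InetAddress host, int port) throws IOException {
        return new Socket(host, port, localAddress, 0);
    }

    @Override
    public Socket createSocket(InetAddress host, int port, InetAddress localHost, int localPort) throws IOException {
        return new Socket(host, port, localAddress, localPort);
    }
}

It would then be used with a URL along the lines of jdbc:postgresql://dbhost:5432/db?socketFactory=com.example.BoundSocketFactory&socketFactoryArg=192.168.1.10 (class name and address are placeholders).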
- Testing a decorator that wraps api routes
I am building a REST API and, while making the routes for it, I stumbled upon a problem. I am using Flask-RESTPlus to build the API and marshmallow to validate the JSON sent by the client.
My design: I use a decorator to wrap every API route. This decorator validates the JSON sent by the client against a marshmallow schema; if the JSON is valid, the decorator lets the API route run. Otherwise, if the JSON fails validation against the schema, it returns the validation errors to the client without running the route.
I really like this design, as it significantly reduces code repetition and automatically validates or rejects the data posted by the client without me doing pretty much the same thing in each API route: checking the JSON sent by the client and then running the route.
My only issue is that I have no clue how to unit test this. I have written tests for the specific marshmallow JSON schemas to check that they raise the correct validation errors when invalid data is passed to them. However, now I need to test the API routes to check that they return the validation errors raised by the schemas. This seems like a lot of repetition of unit tests, because I'm checking for the same errors when testing the schemas and again when testing the API routes/decorator.
Therefore, do you have any recommendations on how I should unit test this? Should I test the API routes, the decorator, and the schemas separately? Or should I test just the API routes to check that they return the correct errors that the schemas raise?
Thanks in advance.
- Write tests for Visual Studio extensions
I have written an extension for VS 2017 but I can't seem to find how to write tests for it.
Can someone please refer me to a resource explaining how to test the extension?
Thanks
- Simulate is meant to be run on 1 node = 0 found
I'm unable to make the following test pass, using React JS, Enzyme, and Jest. I already asked a similar question and tried to apply the same method, but it's not going through. Any reason? Should I substitute shallow with mount, or add a dive()?
file.test.js -
// jest mock functions (mocks this.props.func)
const updateSelectedFormJSON = jest.fn();
const closeModal = jest.fn();
const onClick = jest.fn();
const onSaveClick = jest.fn();

// defining this.props
const baseProps = {
    selectedFormJSON: {
        FORM_COLUMN: [],
    },
    updateSelectedFormJSON,
    closeModal,
    onClick,
    onSaveClick,
};

describe('SpecifyBodyModal Test', () => {
    let wrapper;
    let tree;

    beforeEach(() => wrapper = mount(<SpecifyBodyModal {...baseProps} />));

    it('should call closeModal functions on button click', () => {
        baseProps.closeModal.mockClear();
        wrapper.setProps({ updateSelectedFormJSON: null });
        wrapper.find('.add-custom-field-close').at(0).simulate('click')
        expect(baseProps.closeModal).toHaveBeenCalled();
    });
The 2nd test is not passing; the error is: Method “simulate” is meant to be run on 1 node. 0 found instead.
    it('should call onSaveClick functions on button click', () => {
        baseProps.onSaveClick.mockClear();
        wrapper.setProps({ closeModal: null });
        wrapper.find('.tran-button specify-body-continue').at(1).simulate('click')
        expect(baseProps.onSaveClick).toHaveBeenCalled();
Here is the render file (JS):
onSaveClick = () => {
    let json = Object.assign({}, this.props.selectedFormJSON);
    for (let i in json.FORM_COLUMN) {
        json.FORM_COLUMN[i].IncludeInBody = this.state[json.FORM_COLUMN[i].Name];
    }
    this.props.updateSelectedFormJSON(json);
    this.props.closeModal();
}

render() {
    return (
        <div className='specify-grid-modal'>
            <div className='fullmodal'>
                <div className='fullmodal_title'>Specify Body</div>
                <div title="Close Window" className="add-custom-field-close" onClick={() => this.props.closeModal()}><FontAwesome name='xbutton' className='fa-times preview-close' /></div>
            </div>
            <button className='tran-button specify-body-continue' onClick={() => {this.onSaveClick()}} > Continue </button>
            <div className='specify-body-wrapper'>
                {this.renderColumns()}
            </div>
        </div>
    )
}
- How to process multi-line string messages with @ServiceActivator in Spring-Boot
I'm building a Spring Boot app to process messages from logging devices sent through a TCP connection. I found an elegant solution for processing single-line messages; however, I'm struggling to get it to work with the 3-line messages sent by the device.
I tried using @Aggregator, @ReleaseStrategy, and @CorrelationStrategy to accumulate messages coming from the same IP, but the @ServiceActivator still gets single lines.
@Aggregator
public List<String> aggregatingMethod(Message<?> message) {
    List<String> result = new ArrayList<>();
    String line = message.getPayload().toString();
    result.add(line);
    return result;
}

@ReleaseStrategy
public boolean releaseChecker(List<String> messages) {
    return messages.size() == 3;
}

@CorrelationStrategy
public String correlateBy(Message<?> message) {
    return message.getHeaders().get("ip_address").toString();
}

@ServiceActivator(inputChannel = "serviceChannel")
public void service(List<String> in) {
    System.out.println("Message");
    for (String line : in) {
        System.out.println(line.toString());
    }
    String clientData = in.get(0) + in.get(1);
    if (clientData != null && !clientData.isEmpty() && clientData.length() > 30 && clientData.charAt(0) == '#') {
        // send the message to RabbitMQ
        log.info("Received <" + clientData + ">");
        messageService.sendMessage(clientData);
        log.info("Added to the queue " + SpringBootRabbitMQApplication.ELD_MESSAGE_QUEUE);
    } else {
        log.error("Message is null or in wrong format format: " + clientData);
        // reader.close();
    }
}
This is the example I found for single-line messages:

package com.example;

import java.net.Socket;
import javax.net.SocketFactory;

import org.springframework.boot.SpringApplication;
import org.springframework.boot.autoconfigure.SpringBootApplication;
import org.springframework.context.ConfigurableApplicationContext;
import org.springframework.context.annotation.Bean;
import org.springframework.integration.annotation.ServiceActivator;
import org.springframework.integration.annotation.Transformer;
import org.springframework.integration.channel.DirectChannel;
import org.springframework.integration.ip.tcp.TcpReceivingChannelAdapter;
import org.springframework.integration.ip.tcp.connection.AbstractServerConnectionFactory;
import org.springframework.integration.ip.tcp.connection.TcpNetServerConnectionFactory;
import org.springframework.integration.transformer.ObjectToStringTransformer;
import org.springframework.messaging.MessageChannel;

@SpringBootApplication
public class So39290834Application {

    public static void main(String[] args) throws Exception {
        ConfigurableApplicationContext context = SpringApplication.run(So39290834Application.class, args);
        Socket socket = SocketFactory.getDefault().createSocket("localhost", 9999);
        socket.getOutputStream().write("foo\r\n".getBytes());
        socket.close();
        Thread.sleep(1000);
        context.close();
    }

    @Bean
    public TcpNetServerConnectionFactory cf() {
        return new TcpNetServerConnectionFactory(9999);
    }

    @Bean
    public TcpReceivingChannelAdapter inbound(AbstractServerConnectionFactory cf) {
        TcpReceivingChannelAdapter adapter = new TcpReceivingChannelAdapter();
        adapter.setConnectionFactory(cf);
        adapter.setOutputChannel(tcpIn());
        return adapter;
    }

    @Bean
    public MessageChannel tcpIn() {
        return new DirectChannel();
    }

    @Transformer(inputChannel = "tcpIn", outputChannel = "serviceChannel")
    @Bean
    public ObjectToStringTransformer transformer() {
        return new ObjectToStringTransformer();
    }

    @ServiceActivator(inputChannel = "serviceChannel")
    public void service(String in) {
        System.out.println(in);
    }
}
I'm now using a ThreadPoolExecutor and processing the TCP connection the old-fashioned way:
public static void main(String[] args) throws IOException {
    SpringApplication.run(SpringBootRabbitMQApplication.class, args);
    // create the socket server object
    server = new ServerSocket(port);
    while (true) {
        try {
            // socket object to receive incoming client requests
            final Socket s = server.accept();
            taskExecutor.execute(new Runnable() {
                @Override
                public void run() {
                    boolean messageIsValid = true;
                    while (messageIsValid) {
                        try {
                            // obtaining input and out streams
                            BufferedReader reader = new BufferedReader(new InputStreamReader(s.getInputStream()));
                            String clientData = "";
                            // read message string
                            clientData = reader.readLine();
                            clientData += reader.readLine();
                            reader.readLine();
                            log.info("Read message from port: " + clientData);
                            if (clientData != null && !clientData.isEmpty() && clientData.length() > 30 && clientData.charAt(0) == '#') {
                                // send the message to RabbitMQ
                                messageService.sendMessage(clientData);
                            } else {
                                log.error("Message is null or in wrong format format: " + clientData + "; Client address: " + s.getInetAddress());
                                messageIsValid = false;
                            }
                        } catch (IOException e) {
                            log.error("Couldn't read the message from socket");
                            e.printStackTrace();
                            messageIsValid = false;
                        }
                    }
                }
            });
        } catch (Exception e) {
            e.printStackTrace();
        }
    }
}
And it works as expected, but I'd like to get it to work using @ServiceActivator.
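For comparison, the annotation-style aggregator in the Spring Integration reference puts the @Aggregator, @ReleaseStrategy, and @CorrelationStrategy methods into one component and declares the channels on @Aggregator itself. A sketch of that wiring follows; the channel names are illustrative, and the @Transformer above would then need outputChannel = "aggregatorChannel" instead of "serviceChannel".

import java.util.ArrayList;
import java.util.List;

import org.springframework.integration.annotation.Aggregator;
import org.springframework.integration.annotation.CorrelationStrategy;
import org.springframework.integration.annotation.ReleaseStrategy;
import org.springframework.integration.ip.IpHeaders;
import org.springframework.messaging.Message;
import org.springframework.stereotype.Component;

@Component
public class DeviceMessageAggregator {

    // Receives the grouped single-line payloads and forwards them as one list.
    @Aggregator(inputChannel = "aggregatorChannel", outputChannel = "serviceChannel")
    public List<String> aggregate(List<String> lines) {
        return new ArrayList<>(lines);
    }

    // Release the group once three lines have arrived.
    @ReleaseStrategy
    public boolean canRelease(List<Message<?>> messages) {
        return messages.size() == 3;
    }

    // Group lines coming from the same device by sender IP.
    @CorrelationStrategy
    public String correlateBy(Message<?> message) {
        return message.getHeaders().get(IpHeaders.IP_ADDRESS, String.class);
    }
}

The @ServiceActivator on serviceChannel would then receive the released List<String>, as in the original code.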
- SpringBoot 2 UnsatisfiedDependencyException on mvcContentNegotiationManager
I am using Spring Boot 2.0.4 with Spring Cloud dependencies Finchley.RELEASE. When I try to use @EnableOAuth2Sso in my WebSecurityConfigurerAdapter class, I get the exception below.
Error creating bean with name 'org.springframework.boot.autoconfigure.security.oauth2.client.OAuth2SsoDefaultConfiguration': Unsatisfied dependency expressed through method 'setContentNegotationStrategy' parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'org.springframework.boot.autoconfigure.web.servlet.WebMvcAutoConfiguration$EnableWebMvcConfiguration': Unsatisfied dependency expressed through method 'setConfigurers' parameter 0; nested exception is org.springframework.beans.factory.UnsatisfiedDependencyException: Error creating bean with name 'com.abc.SecurityConfiguration': Unsatisfied dependency expressed through method 'setContentNegotationStrategy' parameter 0; nested exception is org.springframework.beans.factory.BeanCurrentlyInCreationException: Error creating bean with name 'mvcContentNegotiationManager': Requested bean is currently in creation: Is there an unresolvable circular reference?
Please help me resolve this issue. I don't have any explicit spring-security or spring-web references in my pom.xml.
- How does Wiremock determine which TLS version to use?
I have a client which sends a handshake request to WireMock specifying TLS 1.2, but WireMock replies with TLS 1.0. How do I get WireMock to use version 1.2?
- Wiremock/Elasticsearch migrating from Java8 to Java9 / Java11
I am currently migrating an app from OpenJDK 8 to OpenJDK 9/11. The project uses Maven 3.5.2 (I also tried 3.6). I am using Ubuntu 18.04 as my primary OS. My IDE is IntelliJ IDEA 2018.3.2.
After adding module-info.java, most of the Maven dependencies worked fine with requires ${module.name}. I have problems with elasticsearch and wiremock: by requiring them I am getting "module not found".
In pom.xml they are included as follows:

<dependency>
    <groupId>com.github.tomakehurst</groupId>
    <artifactId>wiremock</artifactId>
    <version>1.58</version>
    <exclusions>
        <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>servlet-api</artifactId>
        </exclusion>
        <exclusion>
            <groupId>org.mortbay.jetty</groupId>
            <artifactId>jetty</artifactId>
        </exclusion>
    </exclusions>
    <classifier>standalone</classifier>
</dependency>
<dependency>
    <groupId>org.elasticsearch</groupId>
    <artifactId>elasticsearch</artifactId>
    <version>1.3.2</version>
</dependency>
By requiring wiremock in module-info.java:

module com.mydomain {
    requires guava;
    ...
    requires wiremock;
    requires elasticsearch;

    exports com.mydomain.p1;
    exports com.mydomain.p2;
}
With wiremock <version>1.58</version> in pom.xml, during compilation I am getting:

[WARNING] Can't extract module name from mysql-connector-mxj-5.0.12.jar: TestDb.class found in top-level directory (unnamed package not allowed in module)
[WARNING] Can't extract module name from wiremock-1.58-standalone.jar: Provider class com.fasterxml.jackson.core.JsonFactory not in module
[WARNING] Can't extract module name from ical4j-3.0.1.jar: Provider class moduleName = ical4j-module not in module
[WARNING] Can't extract module name from elasticsearch-1.3.2.jar: Provider class com.fasterxml.jackson.core.JsonFactory not in module
[WARNING] ********************************************************************************************************************
[WARNING] * Required filename-based automodules detected. Please don't publish this project to a public artifact repository! *
[WARNING] ********************************************************************************************************************
[ERROR] Failed to execute goal org.apache.maven.plugins:maven-compiler-plugin:3.8.0:compile (default-compile) on project project: Compilation failure: Compilation failure:
[ERROR] /home/artem/Documents/project/src/main/java/module-info.java:[95,14] module not found: wiremock
[ERROR] /home/artem/Documents/project/src/main/java/module-info.java:[97,14] module not found: elasticsearch
Updating to newer versions does not help. If I use wiremock <version>2.1.6</version> or higher, IntelliJ says "module not found" for:

requires commons.lang;
requires json;
And:

[ERROR] Failed to execute goal on project project: Could not resolve dependencies for project com.mydomain:project:jar:0.0.1-SNAPSHOT: Failure to find com.github.tomakehurst:wiremock:jar:standalone:2.21.0 in https://repository.mulesoft.org/nexus/content/repositories/public was cached in the local repository, resolution will not be reattempted until the update interval of codehaus-release-repo has elapsed or updates are forced
- Wiremock vs Mockito?
I'm trying to choose between these two tools to mock out our API calls. Are there any major advantages or disadvantages that one has over the other?
Thanks in advance.
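To make the distinction concrete (an illustrative sketch with made-up names, not tied to any particular codebase): Mockito replaces a Java collaborator in-process, while WireMock stands in for the remote HTTP service itself, so a real HTTP client can be exercised against it.

import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;
import static org.mockito.Mockito.mock;
import static org.mockito.Mockito.when;

import com.github.tomakehurst.wiremock.WireMockServer;

public class MockingStyles {

    interface UserClient {
        String fetchName(long id);
    }

    public static void main(String[] args) {
        // Mockito: stub a Java interface in-process; no HTTP involved.
        UserClient client = mock(UserClient.class);
        when(client.fetchName(42L)).thenReturn("Alice");
        System.out.println(client.fetchName(42L)); // prints "Alice"

        // WireMock: stub the HTTP endpoint itself; any real HTTP client can call it.
        WireMockServer server = new WireMockServer(8090);
        server.start();
        server.stubFor(get(urlEqualTo("/users/42"))
                .willReturn(aResponse().withStatus(200).withBody("Alice")));
        // ... point the real API client at http://localhost:8090 and exercise it ...
        server.stop();
    }
}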
- Generated test sources not getting invoked as part of the build
Having:
- added the Spring Cloud Contract plugin and configured it for baseClassMappings as well as basePackageForTests
- added the Spring Cloud Contract verifier
- added the required base classes for generated tests
- and defined a Groovy contract

the build successfully generates test sources (which pass when run directly, e.g. from within the IDE).
But...
These generated test sources do not get invoked as part of the build (e.g. ./gradlew clean build), meaning I do not know whether I have broken the contract until I run the generated tests manually, e.g. from the IDE. Have I missed a step?
Producer project: https://github.com/bilalwahla/cdc
- Spring Cloud Contract with collection in request
I'm preparing contracts for an external service we depend on. One of the fields in the request is a collection, and its size can vary. In the simplest case I have a collection of strings. How do I define the request so that it matches each element of that collection against a given regex, and at the same time allows me to test the real API with values I specify?
I imagine it would look something like this:
request {
    method 'POST'
    url "/abc"
    body([
        "someToken": $(
            consumer(regex('[a-zA-Z0-9=]+')),
            producer("asdgwrg92jgwd0vA")
        ),
        "someCollection": [$(
            consumer(regex('[A-Z]{5}')),
            producer('"ASDFF","ASDGG"')
        )]
    ])
    headers {
        contentType(applicationJson())
    }
}
I can only test the real service with a request body such as:

{
    "someToken": "asdgwrg92jgwd0vA",
    "someCollection": [ "ASDFF", "ASDGG" ]
}

But my service is able to produce this as well:

{
    "someToken": "adgwrgsh",
    "someCollection": [ "ASDFG" ]
}
The situation gets more complicated with a collection of objects...
I looked for an answer in the official docs, and there were some possibilities described, but I wasn't able to find them in the samples :/
- Using Spring Cloud Contract Stub Runner with manually created stubs
I'm not sure whether using spring-cloud-contract for the purpose mentioned in the title is right or not, but I want to understand its use case in such a scenario.
We are updating our integration tests to use Spring Cloud Contract. For new features, we follow the documentation for creating stubs from YAML/Groovy and then use them on the consumer side with spring-cloud-starter-contract-stub-runner.
My concern is the stubs which were already written and are being used. We don't want to spend time rewriting them in Groovy/YAML just to make Spring Cloud Contract generate stubs, as we already have them. The existing test configuration is as follows:
@SpringBootTest(
    webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
    classes = Application.class
)
@AutoConfigureWireMock(port = 8989,
    stubs = "classpath*:**/stubs/mappings/**/*.json",
    files = "classpath*:**/stubs")
class MyClientIT {
    ...
}
When it comes to changing this test to use spring-cloud-contract, I'm stuck with how to configure stub-runner:
@SpringBootTest(
    webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
    classes = Application.class
)
@AutoConfigureMockMvc
@AutoConfigureStubRunner(???)
public class MyNewClientIT {...}
Is it possible and advisable to use Spring Cloud Contract for such a case? And if so, how can we make the stub runner see locally stored stubs (not stored as an artifact locally or remotely)?
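If it helps, my understanding (please verify against the Stub Runner documentation) is that the stub runner can be pointed at a local directory of hand-written WireMock mappings via the stubs:// protocol on repositoryRoot, so no contract-generated artifact is needed. The path, coordinates, and port below are placeholders:

import org.springframework.boot.test.autoconfigure.web.servlet.AutoConfigureMockMvc;
import org.springframework.boot.test.context.SpringBootTest;
import org.springframework.cloud.contract.stubrunner.spring.AutoConfigureStubRunner;
import org.springframework.cloud.contract.stubrunner.spring.StubRunnerProperties;

@SpringBootTest(
    webEnvironment = SpringBootTest.WebEnvironment.RANDOM_PORT,
    classes = Application.class
)
@AutoConfigureMockMvc
@AutoConfigureStubRunner(
    stubsMode = StubRunnerProperties.StubsMode.REMOTE,
    // Local directory that already contains mappings/ and __files/ (placeholder path).
    repositoryRoot = "stubs://file://path/to/existing/stubs",
    ids = "com.example:my-client-stubs:+:8989") // placeholder coordinates and port
public class MyNewClientIT {
    // ...
}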
Thank you...