We all know REST services, right? And REST services work with JSON, right? Well, …, not exactly. Web services based on SOAP (Simple Object Access Protocol) inherently use XML as the payload of request and response messages. That is because SOAP is defined as an XML-based standard, in which the service is elaborately defined in a service contract using WSDL (Web Services Description Language).
SOAP web services normally use HTTP as a transport, although there are examples of SOAP over JMS. Despite the word Simple in the acronym, SOAP is considered to be complex. At least for mobile devices.
A few weeks ago I got to help a colleague with a problem calling a REST service. It was a multipart/form-data REST service, not the nowadays more common REST/JSON kind. But more on that in a later article.
I want to focus on the problem as it landed on my desk. The calling service should send a request with metadata in the form-data parts, and a report file as an attachment. All went well while the file was under 3 KB in size. With larger files, an HTTP response code of 405 was returned with a payload like:
The service seems…
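To make the shape of such a request concrete, here is a minimal sketch of how a multipart/form-data body with a metadata part and a file attachment could be assembled. The part names (`metadata`, `report`), the filename, and the boundary string are made up for illustration; they are not taken from the actual service contract.

```java
import java.nio.charset.StandardCharsets;

// Hypothetical sketch of a multipart/form-data request body:
// metadata in an ordinary form-data part, the report file as a binary part.
public class MultipartSketch {

    static final String BOUNDARY = "----fuse-demo-boundary";

    public static String buildBody(String metadataJson, byte[] reportBytes) {
        StringBuilder sb = new StringBuilder();
        // Plain form-data part carrying the metadata
        sb.append("--").append(BOUNDARY).append("\r\n")
          .append("Content-Disposition: form-data; name=\"metadata\"\r\n\r\n")
          .append(metadataJson).append("\r\n");
        // File part carrying the report as an attachment
        sb.append("--").append(BOUNDARY).append("\r\n")
          .append("Content-Disposition: form-data; name=\"report\"; filename=\"report.pdf\"\r\n")
          .append("Content-Type: application/octet-stream\r\n\r\n")
          .append(new String(reportBytes, StandardCharsets.ISO_8859_1)).append("\r\n");
        // Closing boundary marks the end of the multipart body
        sb.append("--").append(BOUNDARY).append("--\r\n");
        return sb.toString();
    }
}
```

The overall `Content-Type` header of the request would then be `multipart/form-data; boundary=----fuse-demo-boundary`.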
Earlier I wrote about how I created a seemless desktop using Vagrant, VirtualBox en MobaXterm.
A few months ago I was busy creating a new box with Oracle Linux, later switching to CentOS and installing several IDE’s in it. And Docker. This lead to my earlier published article about my Fuse Development Environment.
Earlier I wrote about how to consume a SOAP Service in a Fuse/Camel integration using the CXF framework. But how about SOAP Faults?
The processing of a service can result in an error. There are several types of errors, of course. The call to the service itself could fail, for instance because the service is down or unreachable due to a network outage. But if you look up an order with an invalid id, did the service then fail? I don’t think so: the service did its job perfectly fine, it worked. …
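The distinction above can be sketched in plain Java: a lookup that finds nothing is a normal business outcome, while an infrastructure problem is a genuine fault. In a JAX-WS/CXF service, a checked exception like the one below would typically surface to the consumer as a SOAP Fault; here it is a plain exception so the example stays self-contained. All names (`OrderService`, `lookupOrder`, the backing map) are hypothetical.

```java
import java.util.Map;
import java.util.Optional;

// Hypothetical sketch: "not found" is an empty result, not a fault;
// an unreachable backend is a real fault the caller must handle.
public class OrderService {

    public static class ServiceUnavailableException extends Exception {
        public ServiceUnavailableException(String msg) { super(msg); }
    }

    private final Map<String, String> orders;
    private final boolean backendUp;

    public OrderService(Map<String, String> orders, boolean backendUp) {
        this.orders = orders;
        this.backendUp = backendUp;
    }

    public Optional<String> lookupOrder(String id) throws ServiceUnavailableException {
        if (!backendUp) {
            // Technical failure: this is what a SOAP Fault is for
            throw new ServiceUnavailableException("order backend unreachable");
        }
        // Unknown id: the service worked; it simply found nothing
        return Optional.ofNullable(orders.get(id));
    }
}
```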
If you’re quick, you can still take advantage of the free Java 25th anniversary Learning Subscription and get certified for only $25.
You can find the discount here.
Many people complained about not getting access to the student guide and the lab files. I got involved in that discussion, and I found that the trainer discusses the labs at great length. In fact, he does the exercises for you: he shows the student guide and copies and pastes the actual code into the editor. So, in fact, he does not write any code himself during the training. It’s all in the…
Last week I wrote about Red Hat Fuse and the REST DSL in combination with OpenJPA. That article covered how to query, insert, update, and delete entities using OpenJPA. But how about updating an existing entity object outside the REST service? You might poll for entities that conform to a certain condition and do a logical delete by updating a status attribute in the entity object.
In our example, you create a new order using the REST service. Subsequently, our shipment processor may poll the Shipments table for orders whose shipped status is false. After processing the shipment, the status should be…
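The poll-and-update (logical delete) pattern described above can be sketched as follows. The `Shipment` class and the in-memory list stand in for the real OpenJPA-managed entity and table; with JPA the poll would be a query along the lines of `SELECT s FROM Shipment s WHERE s.shipped = false`, followed by a merge inside a transaction. Names here are illustrative, not taken from the original article's code.

```java
import java.util.List;

// Hypothetical sketch of polling for unshipped orders and flipping
// the status attribute instead of deleting the row.
public class ShipmentProcessor {

    public static class Shipment {
        final long id;
        boolean shipped;
        Shipment(long id, boolean shipped) { this.id = id; this.shipped = shipped; }
    }

    // Returns how many pending shipments were processed
    public static int processPending(List<Shipment> table) {
        int processed = 0;
        for (Shipment s : table) {
            if (!s.shipped) {       // the poll condition
                s.shipped = true;   // the "logical delete": update the status
                processed++;
            }
        }
        return processed;
    }
}
```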
One of the observations I had about microservices was that it seemed to me that we got back to coding services in Java. Although there are several frameworks, I got the idea that it is mostly a throwback to the 3GL. You may have learned from my articles that I have a strong background in ESB and SOA. These technologies support a more declarative way of building services. I started my career in the Oracle world as a 4GL and CASE (Oracle Designer) developer, which also put me on the path of declarative development and Low Code. And although I…
In my previous article, I explained the plumbing of setting up a basic OpenShift Pipelines pipeline.
Having OpenShift Pipelines implemented and working with a simple pipeline such as the clone-list-pipeline, the next step is adding an actual build/deploy task. There are a few approaches to doing so:
As the son of a plumber, this is the title I had to come up with for my first article on OpenShift Pipelines.
OpenShift Pipelines is based on Tekton and is a cloud-native CI/CD tool for the Kubernetes platform. A pipeline is a structure of tasks that, in this case, automates the build, deployment, and related steps of a cloud application.
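As a rough idea of what such a structure of tasks looks like, here is a minimal sketch of a Tekton Pipeline resource in the spirit of a clone-then-list pipeline. The `git-clone` task reference matches the well-known Tekton catalog task; the `list-source` task name and the workspace/parameter names are assumptions for illustration.

```yaml
# Minimal Tekton Pipeline sketch (names are illustrative)
apiVersion: tekton.dev/v1beta1
kind: Pipeline
metadata:
  name: clone-list-pipeline
spec:
  workspaces:
    - name: shared-workspace      # shared between the tasks
  params:
    - name: git-url
      type: string
  tasks:
    - name: fetch-repository      # first task: clone the sources
      taskRef:
        name: git-clone
      workspaces:
        - name: output
          workspace: shared-workspace
      params:
        - name: url
          value: $(params.git-url)
    - name: list-source           # second task: runs after the clone
      runAfter:
        - fetch-repository
      taskRef:
        name: list-source
      workspaces:
        - name: source
          workspace: shared-workspace
```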
There are already several tutorials on Tekton and OpenShift Pipelines. To name a few:
In integration scenarios and complex, heterogeneous application architectures, tracking and tracing integration instances is important.
Red Hat Fuse, based on Apache Camel, supports OpenTracing, which can be connected to Jaeger.
There is a community Jaeger operator for Kubernetes and OpenShift.
Integration Specialist and technical architect at Virtual Sciences | Conclusion