From fulfilling the slogan "The Network is the Computer", through the Java EE standard and support for both SOAP and RESTful Web Services, to a strong ecosystem and communities built around microservices and cloud-native applications in which APIs are essential, Java has been intertwined with the Internet since its inception. It supports existing standards as well as bringing in its own vendor-neutral and de facto standards that reflect current challenges, such as MicroProfile or integration with AI.
Java turns 30 this year and no one can count how many applications have been created with Java over the three decades. But those applications rarely exist in isolation!
I always try to remind people working on development projects about the often overlooked importance of remote APIs and integration aspects that are vital to connect pieces of software together. I also often stress that people on a project communicating and working together well are more important than solving most technical challenges. In this article I want to show the great role that Java has been playing in connecting software systems and integrating different groups of people in the software industry.
Java was created at Sun Microsystems whose slogan was “The Network is the Computer”. At the time Java was born, people started to realise that not just any network would do. The Internet was what would shape the future of computing.
Already the first release of Java offered developers easy built-in support for many features crucial in the Internet era: the HTTP protocol, text in any language encoded in Unicode, efficient concurrency thanks to threads, rendering of web graphics formats like JPEG and GIF, and more.
At first, Java aimed at embedded computing. The platform-independent JVM and Java API were defined so that developers could write Java code once and run it anywhere (on any device). However, what most early adopters of Java actually got to play with were applets: programs running inside a web browser, using the Internet as a distribution channel to download bytecode that could run on any physical CPU architecture and any operating system. At a time when JavaScript was incompatible across browsers, or not available at all, Java applets pioneered the interactive Internet experience we take for granted today.
The heart of the Web is HTTP, a client-server protocol. It soon became apparent that a very good place for Java is the server side. The second major version of Java came with the Java 2 Platform, Enterprise Edition (J2EE) specification. Java EE allowed Java to excel on the server side by delegating the complexity of a server handling many client requests to the Java EE container. By "plugging in" to the Java EE APIs, for example the Servlet API, application developers could focus on implementing their business logic.
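To illustrate the "plugging in" idea, here is a minimal servlet sketch (using today's jakarta.* namespace; the class name and URL path are illustrative):

```java
import java.io.IOException;

import jakarta.servlet.annotation.WebServlet;
import jakarta.servlet.http.HttpServlet;
import jakarta.servlet.http.HttpServletRequest;
import jakarta.servlet.http.HttpServletResponse;

// The container manages threads, sockets and the HTTP protocol;
// the application only implements the logic of the request.
@WebServlet("/hello")
public class HelloServlet extends HttpServlet {

    @Override
    protected void doGet(HttpServletRequest request, HttpServletResponse response)
            throws IOException {
        response.setContentType("text/plain");
        response.getWriter().println("Hello from the container!");
    }
}
```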
Java is not the only platform supporting the concept of an application server. What makes it unique is the number of diverse commercial vendors and open source projects that have implemented the Java EE specification. The vendor neutrality continues with the Jakarta EE standard managed by the Eclipse Foundation. Developers familiar with the standard can switch from one container implementation to another one. While enabling competition and innovation, the standard keeps the Java community connected.
However, vendor-neutral standards were not always in time to provide developers with what they needed to build and integrate their applications effectively. Early versions of Java EE were infamous for being unpleasant to use and requiring a lot of boilerplate code. The Spring Framework addressed the issue by providing its own abstractions on top of Java standard APIs, thereby winning the hearts of many enterprise Java developers. But even when you use Spring for your server-side application, you still depend on Java EE underneath. Many of the abstractions provided by Spring inspired similar features in newer Java/Jakarta EE specifications.
Besides integrating remote APIs, our applications need to be integrated internally, connecting multiple modules and layers. The Spring Framework popularised dependency injection to avoid coupling between modules. Dependency injection is now also part of Java EE through the CDI standard.
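A CDI sketch of the idea (the service and repository names are hypothetical): the service depends only on an interface, and the container wires in a matching implementation.

```java
import jakarta.enterprise.context.ApplicationScoped;
import jakarta.inject.Inject;

// The CDI container picks an implementation of this interface at runtime.
interface OrderRepository {
    void save(String orderJson);
}

// OrderService depends only on the interface; the container injects
// a matching bean, so the modules stay decoupled.
@ApplicationScoped
public class OrderService {

    private final OrderRepository repository;

    @Inject
    public OrderService(OrderRepository repository) {
        this.repository = repository;
    }

    public void placeOrder(String orderJson) {
        repository.save(orderJson);
    }
}
```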
In addition to synchronous communication using the ubiquitous HTTP and easily available thanks to the Servlet API, Java EE also recognised the importance of asynchronous communication via message brokers by defining the JMS standard. By removing the direct links between the communicating parties, the message-oriented integration decreases coupling and improves flexibility and scalability of systems. Thanks to the standard API, the same Java code can work with different messaging server implementations.
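With JMS 2.0 the sending side can be as small as this sketch; the queue name and JNDI lookup are hypothetical and would be configured in the container.

```java
import jakarta.annotation.Resource;
import jakarta.ejb.Stateless;
import jakarta.inject.Inject;
import jakarta.jms.JMSContext;
import jakarta.jms.Queue;

// Publishes a message without knowing which broker implements the queue;
// the same code runs against any JMS-compliant messaging server.
@Stateless
public class OrderPublisher {

    @Inject
    private JMSContext context;

    @Resource(lookup = "java:/jms/queue/orders")
    private Queue ordersQueue;

    public void publish(String orderJson) {
        context.createProducer().send(ordersQueue, orderJson);
    }
}
```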
We are also experiencing an increase in popularity of non-JMS asynchronous communication platforms like Kafka or various cloud-based messaging services. With Java you can use their respective native clients directly, but there are also options that let you abstract the details of the messaging platform and use them as just “dumb pipes” in solutions based on Enterprise Integration Patterns. Apache Camel offers a mature Domain-Specific Language and Spring Cloud Stream a modern functional programming style.
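As a sketch of the Camel DSL treating the messaging platforms as "dumb pipes", the following route consumes from a Kafka topic, filters, and forwards to a JMS queue (topic and queue names are illustrative, and the component endpoints would need to be configured):

```java
import org.apache.camel.builder.RouteBuilder;

// An Enterprise Integration Patterns route: the endpoint URIs hide the
// details of the underlying messaging platforms.
public class OrderRoute extends RouteBuilder {

    @Override
    public void configure() {
        from("kafka:incoming-orders")
            .filter(body().contains("\"priority\"")) // message filter pattern
            .to("jms:queue:priority-orders");
    }
}
```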
The success of the Web and its support by different devices, operating systems and applications brought the idea that the infrastructure of the Web could be used for more than just HTML pages. The term Web Services was adopted for the use of the web protocols and infrastructure to connect applications in general. The first Web Services standard (also colloquially known as “SOAP Web Services”), created by the World Wide Web Consortium (W3C), enabled unprecedented interoperability of APIs implemented by diverse technology stacks. In particular, it made integrating Java-based applications with Microsoft-specific implementations much easier.
Due to the diversity of the Java community, multiple libraries and frameworks for implementing the W3C Web Services standard started to appear: Apache Axis, CXF or Metro. A declarative, annotation-based approach to Web Services was later standardised as JAX-WS.
The W3C Web Services turned out to be more complex than necessary, duplicating features already available in HTTP itself. Many developers and architects started to prefer RESTful Web Services and exchanged XML for the simpler JSON as the format for their data.
REST is a very popular API style, and there are several Java frameworks with which you can implement it: the various implementations of JAX-RS (now Jakarta RESTful Web Services) or Spring Web. All of them use annotations to declaratively specify API endpoint paths, methods, parameters etc., and automatically serialise/deserialise JSON payloads to/from Java models.
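A minimal JAX-RS resource might look like the following sketch (the Customer model and the hard-coded response are illustrative):

```java
import jakarta.ws.rs.GET;
import jakarta.ws.rs.Path;
import jakarta.ws.rs.PathParam;
import jakarta.ws.rs.Produces;
import jakarta.ws.rs.core.MediaType;

// A simple model the framework serialises to JSON.
record Customer(long id, String name) {}

@Path("/customers")
public class CustomerResource {

    // Handles e.g. GET /customers/42, returning the Customer as JSON.
    @GET
    @Path("/{id}")
    @Produces(MediaType.APPLICATION_JSON)
    public Customer getCustomer(@PathParam("id") long id) {
        // A real implementation would look the customer up in a repository.
        return new Customer(id, "Ada");
    }
}
```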
Java frameworks make providing and consuming web services simpler and faster. However, if an API is to be supported for a long time and/or by many clients, it pays off to maintain the API specification as a document clearly separated from the implementation, using a widely accepted specification language. The W3C (SOAP) Web Services standard includes the WSDL format to specify the APIs. For RESTful Web Services the most commonly used specification language evolved from Swagger to OpenAPI.
The specification-first approach to building APIs means starting with a high-quality OpenAPI document. The OpenAPI Generator is a mature open-source project written in Java that can generate Java interfaces and models from OpenAPI. Using its Maven and Gradle plugins, it can be integrated into the build, so that the Java code is always in sync with the API specification. The generator can also generate code in many other languages, so using OpenAPI is a great way to integrate with non-Java systems.
If you prefer the code-first approach, all common Java application frameworks support generating OpenAPI documents and a Swagger web-based UI from the annotated Java code.
Whether your APIs are specification-first or code-first, it is recommended, for multiple reasons, to keep them in a layer separate from the core domain logic of your application.
This also means that each of the layers should use its own objects for holding application data. Annotation processing comes to the rescue again: we can map data transferred between the layers declaratively using MapStruct.
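A MapStruct mapper is just an annotated interface; the implementation is generated at compile time. In this sketch the domain object and the DTO (both hypothetical) differ in one field name:

```java
import org.mapstruct.Mapper;
import org.mapstruct.Mapping;
import org.mapstruct.factory.Mappers;

// Hypothetical domain object and DTO with slightly different field names.
record Customer(long id, String fullName) {}
record CustomerDto(long id, String name) {}

// MapStruct's annotation processor generates the implementation of this
// interface at build time; no reflection is needed at runtime.
@Mapper
public interface CustomerMapper {

    CustomerMapper INSTANCE = Mappers.getMapper(CustomerMapper.class);

    @Mapping(source = "fullName", target = "name")
    CustomerDto toDto(Customer customer);
}
```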
We use a special type of Java object, the so-called Data Transfer Object (DTO), to model API payloads as well as the data passed to the domain layers of the application. DTOs are not typical objects as defined by object-oriented programming: they usually do not contain methods other than those needed to build the object (including validation) and to read its attributes. Working with this type of object aligns well with the increasing popularity of Data-Oriented Programming (a subset of functional programming). To make data objects safe for streams and concurrent processing, it is recommended to make them immutable. Java reacted to this trend by providing the record type, which is immutable and reduces the boilerplate needed to define a DTO.
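For example, a hypothetical customer payload becomes a short record, with validation placed in the compact constructor:

```java
// An immutable DTO as a record: the compiler generates the constructor,
// accessors, equals, hashCode and toString.
public record CustomerDto(long id, String name) {

    // Compact constructor: the place for validation while building the object.
    public CustomerDto {
        if (name == null || name.isBlank()) {
            throw new IllegalArgumentException("name must not be blank");
        }
    }
}
```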
As the volume of traffic and the scalability requirements for many APIs grew, the classical thread-per-request programming model supported by the Servlet API (and other server-side APIs) started to be perceived as a bottleneck, mostly because integration-related code spends a significant proportion of its processing time waiting for I/O. At first many Java projects decided to tackle the problem by switching to the reactive programming paradigm, building applications that process data using reactive streams implemented by RxJava or Project Reactor. The different reactive streams implementations share the same standard interfaces, which became part of the Java platform as the java.util.concurrent.Flow API.
However, for most applications the additional complexity of reactive programming was not justified by the mere need to avoid blocking threads. Virtual threads, coming from Project Loom, can address the problem in a more elegant way. Applications can continue using the thread-per-request coding model (which is simple to read and debug), while the Java standard libraries and the JVM take care of reusing the limited number of platform threads (threads available at the operating system level), so that using a million (virtual) threads, many of which are blocked, is no longer a problem.
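A small sketch of the idea (requires Java 21+): ten thousand tasks, each blocking on a sleep that simulates waiting for I/O, run comfortably on virtual threads.

```java
import java.util.concurrent.ExecutorService;
import java.util.concurrent.Executors;
import java.util.concurrent.atomic.AtomicInteger;
import java.util.stream.IntStream;

public class VirtualThreadsDemo {

    // Submits 10,000 blocking tasks; the JVM multiplexes the virtual
    // threads over a small pool of platform threads.
    public static int runBlockingTasks() {
        AtomicInteger completed = new AtomicInteger();
        try (ExecutorService executor = Executors.newVirtualThreadPerTaskExecutor()) {
            IntStream.range(0, 10_000).forEach(i -> executor.submit(() -> {
                try {
                    Thread.sleep(10); // simulates waiting for I/O
                } catch (InterruptedException e) {
                    Thread.currentThread().interrupt();
                }
                completed.incrementAndGet();
            }));
        } // close() waits for all submitted tasks to finish
        return completed.get();
    }
}
```

The same code with platform threads would need 10,000 OS threads or a much longer wall-clock time on a bounded pool.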
Here we can see a recurring pattern: a common problem in the software industry first triggers solutions based on new libraries and frameworks, which are viable and adopted quickly thanks to the flexibility and type safety of the Java language. Later, the multiple solutions are standardised to make the implementations interoperable. Later still, the implementations can be simplified by new features coming to the Java language and the JVM. Ultimately, the whole Java community can easily reuse the solution to the common problem in different contexts.
Scalability, frequent deployments and reducing dependencies among development teams are motivating factors behind the growing attractiveness of distributed applications based on Service-Oriented Architecture, later rebranded as microservices. It is no surprise that good integration of the services, through good API design and implementation, is very important in such systems. Deployment and operation of the many separate services shifted from application servers (Java EE containers) to Docker containers and cloud deployments.
Stand-alone services packaged as an executable JAR, instead of being deployed to a separate Java EE container, were introduced by Spring Boot. Later, to achieve radical improvements in start-up time and computing resource efficiency, it was necessary to move as much as possible of the process of finding, configuring and interconnecting application components from dynamic processing at application start-up to evaluation at build time. That idea led to the creation of frameworks like Micronaut and Quarkus.
The new frameworks tried to make switching easy for developers by reusing concepts from either the de facto Spring standard (as Micronaut does with its controllers) or from Java/Jakarta EE. Not everything from Jakarta EE was practical to support in the microservices frameworks, and on the other hand there were new aspects of the services that could benefit from standardisation to avoid fragmenting the Java community. Also maintained under the umbrella of the Eclipse Foundation, we now have the MicroProfile specification, a standard that shares the Core Profile with Jakarta EE and is implemented by Quarkus, Helidon and others.
MicroProfile contains many parts that are relevant to integration:
- Jakarta RESTful Web Services, JSON Binding and JSON Processing (already mentioned above)
- MicroProfile OpenAPI to generate OpenAPI documents from Java code
- MicroProfile REST Client to consume RESTful web services
- MicroProfile Fault Tolerance with retries, circuit breakers etc.
- MicroProfile JWT Authentication to handle security tokens issued by specialised OAuth servers
- MicroProfile Telemetry to monitor our services and APIs well
- MicroProfile Config and Health so that the services can be configured and orchestrated in real environments
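As a sketch of the fault tolerance annotations, a remote call can be declared resilient without any imperative retry logic (the client class, method names and parameter values are illustrative):

```java
import java.time.temporal.ChronoUnit;

import org.eclipse.microprofile.faulttolerance.CircuitBreaker;
import org.eclipse.microprofile.faulttolerance.Fallback;
import org.eclipse.microprofile.faulttolerance.Retry;

public class PriceClient {

    // Retry a failing call up to 3 times, open the circuit when half of
    // the recent calls fail, and fall back to a default price meanwhile.
    @Retry(maxRetries = 3, delay = 200, delayUnit = ChronoUnit.MILLIS)
    @CircuitBreaker(requestVolumeThreshold = 10, failureRatio = 0.5)
    @Fallback(fallbackMethod = "defaultPrice")
    public double fetchPrice(String productId) {
        // A real implementation would call the remote pricing API here.
        throw new UnsupportedOperationException("remote call not implemented");
    }

    private double defaultPrice(String productId) {
        return 0.0;
    }
}
```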
When talking about integration, we should not forget testing. Of course, with Java we have the great JUnit framework to automate our tests. But the layers of our applications handling integrations and remote APIs have their specifics. The critical parts often don’t take the form of lines of our Java code whose test coverage is measurable as a percentage. Integration bugs are usually hidden in configurations, authentication, de-/serialisation etc. So it is not enough to focus on just a couple of classes, simulating the rest by mocks.
We need to involve the infrastructure provided by the application framework to run meaningful integration tests. Most frameworks include their own integration testing support; unfortunately, this area is not unified across frameworks by a strong standard like Java/Jakarta EE or MicroProfile. But you can run your application, whatever the framework, as a black box and let the integration tests communicate with it using the real APIs over the real network protocols. In Java you can write this kind of test using the nice DSL provided by the REST Assured library. To simulate APIs outside the scope of the test, there is another great Java-based tool, WireMock.
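The two tools combine naturally, as in this sketch: WireMock simulates a downstream API, and REST Assured exercises it over a real HTTP connection (the port, path and payload are illustrative; in a real test REST Assured would target your application, which in turn calls the WireMock stub).

```java
import static com.github.tomakehurst.wiremock.client.WireMock.aResponse;
import static com.github.tomakehurst.wiremock.client.WireMock.get;
import static com.github.tomakehurst.wiremock.client.WireMock.urlEqualTo;
import static io.restassured.RestAssured.given;
import static org.hamcrest.Matchers.equalTo;

import com.github.tomakehurst.wiremock.WireMockServer;

public class ApiIntegrationTest {

    public static void main(String[] args) {
        // Stub a downstream API on a local port.
        WireMockServer server = new WireMockServer(8089);
        server.start();
        server.stubFor(get(urlEqualTo("/customers/1"))
            .willReturn(aResponse()
                .withHeader("Content-Type", "application/json")
                .withBody("{\"id\":1,\"name\":\"Ada\"}")));

        // Exercise the API over real HTTP and verify the JSON response.
        given()
            .baseUri("http://localhost:8089")
        .when()
            .get("/customers/1")
        .then()
            .statusCode(200)
            .body("name", equalTo("Ada"));

        server.stop();
    }
}
```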
The hot topic of today is the business usage of AI, and especially LLMs. The landscape in this area is changing very rapidly, and clearly both the software structure and the hardware requirements of AI systems are very different from traditional enterprise applications. For these reasons it is likely that for many businesses it makes sense to integrate existing systems with LLM or RAG services via remote APIs. Although there are many fast-evolving and competing AI products, Java already offers adapters that abstract away the differences in their APIs; let's just mention LangChain4j and Spring AI.
I believe that one of the factors that has allowed Java to be a successful language and platform for 30 years is its ability to connect both software systems and developer communities. Projects that relied on Java were able to benefit from efficiencies in both the development process and the execution of delivered applications. Developers can expect Java to stay relevant in the future.
However, good API designs and integration solutions are not automatic even when using Java. Moreover, mistakes are not easy to fix once APIs are in use. API development skills increase with focused training and experience. Applying these skills requires that the company and the project recognise the importance of integration and give it appropriate attention.