
JVM developers moving to containers and microservices to keep up with fast data

[Note: this article by Matt Asay originally appeared on TechRepublic.com]

Developers have never had it so good, swimming as they do in a sea of cheap, flexible cloud hardware and open source software. However, as a new Lightbend survey of 2,100 JVM developers suggests, there has never been a more precarious time for Java Virtual Machine (JVM) developers, as the traditional Java EE app server may be gasping its last breath. Perhaps this isn't surprising, given that machine learning and microservices are completely changing how we program.

Yet, it's disconcerting for the thousands of engineers who have built their careers on what appears to be a dying art. As the survey report concludes, "The old world where JVM language developers relied on operators to do the work around deploying applications is in the midst of major upheaval, as the entire Java EE stack built around heavyweight app servers is losing relevance."

So long, and thanks for all the fish?

I spoke to Lightbend CEO Mark Brewer to get more background on the survey results and learn more about how JVM developers are grappling with modern data realities. He said the challenge stems less from the volume of so-called "big data" and more from meeting new requirements in speed and performance, which he calls the new era of "Fast Data."

This fast data world threatens to completely upend the traditional Java app server, as well as the ops teams and developers who love them.

Java developers and 'Fast Data'

Though a new slant on big data isn't really necessary—Gartner's "three Vs" of big data already incorporate velocity (in addition to volume and variety)—Brewer can be excused for fixating on speed. After all, as he stressed, this is the first time that "any application can take advantage of data not even written to disk—as it's still moving from its source to the application or database."

SEE: How one company improved developer productivity by 700% with reactive programming (Lightbend customer MoneySuperMarket)

This means you don't have to wait for data to land before querying it; you can process it while it's still moving. It also means that speed increasingly defines applications.

Machine learning, anomaly detection, analytics—all of these big data use cases put a premium on speed. Nor are they alone. New applications in IoT, mesh devices, home automation, self-driving cars, and telemetry, among many other use cases, rely on processing data while it's still "in motion," Brewer said.

None of which makes JVM developers' lives any easier.

From a JVM developer standpoint, Brewer told me, this trend has made the applications richer, but also forced developers to be smarter about data. Indeed, "Where batch jobs seldom last for more than a few hours, a streaming pipeline often runs continuously, and this always-on requirement is unprecedented in how developers have their applications interacting with real-time data," Brewer said.
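To make that contrast concrete, here is a minimal sketch of an "always-on" pipeline using Akka Streams, one of the frameworks Brewer names below. The ticking source and the anomaly threshold are hypothetical stand-ins for a real feed such as a Kafka topic; this is an illustration, not a prescribed architecture.

```scala
import akka.actor.ActorSystem
import akka.stream.scaladsl.{Flow, Sink, Source}

import scala.concurrent.duration._
import scala.util.Random

object AlwaysOnPipeline extends App {
  // Akka 2.6+: the ActorSystem also supplies the stream materializer implicitly.
  implicit val system: ActorSystem = ActorSystem("fast-data")

  // Hypothetical unbounded source of sensor readings; in a real system this
  // would be a Kafka topic or a device feed rather than a synthetic ticker.
  val readings: Source[Double, _] =
    Source.tick(0.seconds, 100.millis, ()).map(_ => Random.nextDouble() * 100)

  // Each reading is processed as it arrives -- no batch job, no query over data at rest.
  val flagAnomalies: Flow[Double, String, _] =
    Flow[Double].filter(_ > 95.0).map(v => f"Anomaly detected: $v%.1f")

  // The pipeline runs until the system shuts down: an always-on stream,
  // not a job that finishes after a few hours.
  readings.via(flagAnomalies).runWith(Sink.foreach(println))
}
```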

This shift has led Java developers to the principles of reactive systems, where stream-based architectures are designed to be responsive, resilient, elastic, and message-driven from the start, as Brewer pointed out:

Where in the past, developers simply asked for the schema of the data and stored procedures, now they need to know about its structure, its latency, where it's coming from, and whether any processing needs to be done before it comes into the app. And they have an explosion of frameworks to choose from. Not just Apache Spark or Hadoop, but Akka, Akka Streams, Kafka, Gearpump, Flink, and many more. In some ways, it's like the Wild West today as modern enterprise applications are being written and re-written specifically for big data.
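One small illustration of the "resilient by design" idea: in Akka Streams, a pipeline can be told how to react when processing an element fails, so one bad record doesn't take down an always-on stream. The parser and input data below are invented for the example; this is a sketch of the technique, not anyone's production code.

```scala
import akka.actor.ActorSystem
import akka.stream.{ActorAttributes, Supervision}
import akka.stream.scaladsl.{Sink, Source}

object ResilientPipeline extends App {
  implicit val system: ActorSystem = ActorSystem("reactive")

  // Hypothetical parser that throws on malformed input.
  def parse(line: String): Int = line.trim.toInt

  Source(List("1", "2", "oops", "4"))
    .map(parse)
    // Resilience by design: drop the failing element and keep the stream
    // alive instead of letting one bad record kill the pipeline.
    .withAttributes(ActorAttributes.supervisionStrategy(Supervision.resumingDecider))
    .runWith(Sink.foreach(n => println(s"processed $n")))
}
```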

The challenge isn't isolated to JVM developers, though. Ops teams are also struggling to keep up.

Operations in an ever-faster world

One of the biggest changes for operations teams is that the old way of deploying applications in 12- to 36-month release cycles does not meet the new requirements for getting modern applications to market. These applications, Brewer noted, are increasingly built with agile methodologies, within continuous integration and continuous deployment pipelines.

Not surprisingly, old-school app servers are feeling the heat.

In fact, Brewer said, JVM developers—primarily Scala and Java developers—are moving off of traditional Java application servers that were designed to run on dedicated servers. Instead, they are adopting containers (44% run containers in production or are in serious pilots to do so) and microservices (30% run them in production and another 20% are in "serious pilots") to get more agility out of their infrastructure and the faster release cycles these new applications demand. With 34% of survey respondents saying most of their data processing today is real-time, this isn't a nice-to-have. It's imperative.
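For contrast with the app-server model, a microservice in this world is often just a small, self-contained process that serves HTTP itself and drops straight into a container. The sketch below uses Akka HTTP purely as an illustration; the service name and endpoint are made up, and it assumes the Akka HTTP 10.2+ binding API.

```scala
import akka.actor.ActorSystem
import akka.http.scaladsl.Http
import akka.http.scaladsl.server.Directives._

object HealthService extends App {
  // No application server: the service embeds its own HTTP layer and runs
  // as a plain JVM process, which is what makes it easy to containerize.
  implicit val system: ActorSystem = ActorSystem("microservice")

  // A hypothetical health-check endpoint for the example.
  val route =
    path("health") {
      get {
        complete("OK")
      }
    }

  Http().newServerAt("0.0.0.0", 8080).bind(route)
}
```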

SEE: How The New York Times uses reactive programming tools like Scala to scale

They're also often being driven to containers and microservices because fast data frameworks are distributed—that is, designed to run across multiple machines and in the cloud, with workload portability (31% are already running most of their applications in the cloud and 21% are in the process of creating a cloud-native strategy).

This has prompted a rethinking of application infrastructure that even Oracle recently acknowledged in the Java Community Process (JCP). There, Anil Gaur, the Oracle group vice president responsible for Java EE and WebLogic Server, admitted that "enterprise programming styles are changing," further suggesting that "more and more applications are distributed in nature and get deployed in cloud environments" and therefore require more modern "Reactive-style programming."

Or, as Brewer paraphrased Oracle's perspective, "The Java Virtual Machine is a fantastic run-time, but we have seen a hiccup in the last few years where the application infrastructure for Java apps did not evolve to meet the demands of distributed computing."

Uh, oh.

In the good old days, infrastructure was left to the operators, but today JVM developers are moving to DevOps to keep up with the shifting needs of enterprise apps and infrastructure. Indeed, 57% of the survey respondents said containers will disrupt the JVM landscape. The other 43% are wrong.

 
