
Webinar & Q&A: Welcome to Scala 2.11

Thanks everyone for joining us on yesterday’s webinar, “Welcome to Scala 2.11!”

For those who missed it, Jason Zaugg, compiler engineer at Typesafe, presented how the Scala team worked over the past year to make the platform faster, smaller and more stable. Jason discussed efforts to improve performance for the incremental and batch compilers and collections library, covered the decision to split the formerly monolithic standard library into modules, touched upon efforts to refine experimental language features, and much more.

Watch the video here:

 

You can also flip through the slides for access to links in Jason’s presentation.

Below are some questions and answers from the Q&A session that we thought you might find interesting. Feel free to submit more; we look forward to hearing from you and hope to see you on future webinars!


Q: Will 2.12 require Java 8?

R: We don't know that yet. The answer will depend on two things: How is the pace of adoption of Java 8 going in industry? How compelling are the benefits for Scala users after we mandate Java 8? We've reached out to industry in our Java 8 survey (https://typesafe.com/blog/java-8-survey-results) to help answer the first question. Our technical work in the next 12 months will help answer the second. We'll have to make a call on this around Q3. My gut feeling is that we'll have to wait one more major release.

 

Q: When will Scala 2.11.1 be released? 2.12?

R: We keep our plans up to date in our release roadmap. Our work on the community build and release automation has made the process of cutting a release easier, so we would like to aim for more frequent maintenance releases (every 2-3 months). We hope to release Scala 2.11.1 ASAP to deliver a fix for a regression in serialization that was reported just after the release of 2.11.0.

We are still scoping out the work for 2.12, but we expect to target a release in Q2/Q3 2015. Our efforts to lower the cost of upgrading Scala versions might influence our timing here; if we get feedback that a slightly slower major release cycle would help the community, we'll factor that in.

 

Q: Why isn't the new optimizer already merged into the new backend (-Ybackend:GenBCode)?

R: As some background, Scala 2.11.0 includes a new bytecode emitter, “GenBCode”, which is enabled by a compiler option. It is the result of the excellent work of Miguel Garcia, and an important step for us to improve the speed of the compiler and to fix some long-standing limitations and bugs in `scalac -optimize`.
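For anyone who wants to experiment, here is a minimal build.sbt sketch that opts in to the new backend. The `-Ybackend:GenBCode` flag is the compiler option referred to above; the rest of the build settings are purely illustrative.

```scala
// build.sbt sketch: opting in to the experimental GenBCode backend in Scala 2.11.
// The backend is off by default and is enabled with a single compiler flag.
scalaVersion := "2.11.0"

scalacOptions += "-Ybackend:GenBCode"
```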

We had hoped to bring in the new backend and optimizer in 2.11.0, but we ran out of time. Due to our binary compatibility constraints, which matter most for major versions, we had to focus on API changes to the library and on changes to the compiler that may influence the signatures in generated code (our ABI). In minor releases, we can then work on bugs, or on extensions that are only enabled for early adopters via compiler flags, such as the new backend. We plan to bring in the most important parts of the new optimizer by 2.11.2.

One important step we’ve just taken is to set up a variation of our new Community Build that enables the new backend when compiling 1M lines of code from open source projects. We discovered two bugs, both of which will be fixed in the imminent minor release, 2.11.1. As we incorporate the new optimizer, this build will help us establish a high degree of confidence that it generates correct bytecode.

 

Q: Where are macro annotations?

R: Def macros have been pretty popular in Scala 2.11 for certain domains: DSLs like scala-async or SBT 0.13, high-performance code-rewriting tools like scala-blitz or breeze, and boilerplate reduction in serialization libraries like Play JSON or scala-pickling.
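To give a flavour of what a def macro looks like, here is a minimal sketch of a hypothetical `debug` macro (the object and method names are invented for illustration). It rewrites each call site at compile time so that the source text of the argument is printed next to its value:

```scala
import scala.language.experimental.macros
import scala.reflect.macros.blackbox

object DebugMacro {
  // debug(expr) expands, at the call site, into code that evaluates expr,
  // prints its source text alongside its value, and returns the value.
  def debug[A](expr: A): A = macro impl[A]

  def impl[A](c: blackbox.Context)(expr: c.Expr[A]): c.Expr[A] = {
    import c.universe._
    // showCode renders the argument's syntax tree back into source text
    val exprText = Literal(Constant(showCode(expr.tree)))
    c.Expr[A](q"""
      val result = ${expr.tree}
      println($exprText + " = " + result)
      result
    """)
  }
}

// Usage from a separate compilation unit:
//   DebugMacro.debug(1 + 2)   // prints "1 + 2 = 3" and returns 3
```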

But, we are really mindful of the "unfinished" parts of macros.

 - Macro authors need to have a pretty deep understanding of compiler internals to make macros that work for all inputs

 - The APIs they deal with are large and tightly coupled to compiler internals. This makes us reluctant to offer the usual guarantees of source compatibility between major versions

 - Macro flavours that compute return types, or introduce members, are a blind spot for IDEs that don't execute the macro (IntelliJ, or even the new backwards-compatibility mode of Scala IDE)

Macro annotations are especially prone to the last problem. We've asked the research team at EPFL to look into ways to improve on the status quo; this effort is being led by Eugene Burmako under Project Palladium.

 

Q: You say you want to offer smoother upgrades. Does that mean that 2.12 will be binary compatible with 2.11?

R: Binary compatibility is one pillar of smooth upgrades. But there are some important differences between Java and Scala that mean we're not in a position to make that guarantee just yet.

Think of traits. When you add a new method body to a trait, you need to recompile the implementing classes to create the forwarders to that code. But when making the change, it sort of feels like you shouldn't need to do that. In Java (before 8), the same restriction is in place: if you add a new method to an interface and don't implement it in subclasses, you get linkage errors at runtime. So the authors of the Java standard library have been constrained to *never* add a method to an interface after it has been released.
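As a concrete illustration, here is a minimal sketch (the names are invented) of a trait with a method body and why its implementors are affected by changes to it:

```scala
// A trait with a concrete method body. In Scala 2.11, scalac compiles this to
// an interface plus an implementation class; every class that mixes Greeter in
// gets a small forwarder method generated at *that class's* compile time.
trait Greeter {
  def name: String
  def greet(): String = s"Hello, $name"
}

// If Greeter later gains another concrete method, Person must be recompiled so
// that the new forwarder exists; otherwise linkage fails at runtime.
class Person(val name: String) extends Greeter

object Demo extends App {
  println(new Person("Scala").greet())   // prints "Hello, Scala"
}
```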

In Java 8, they have actually outgrown this restriction and added "default" methods into the JVM spec to give you something like Scala traits (only methods, mind you, no fields). We're hopeful that we can back Scala traits (or a subset thereof) with this new facility.

Now, because the Java library authors knew from the outset that they basically couldn't touch an API after it was released, one can only assume that they had to spend a lot more time on review and testing before releasing. Even then, things haven't always gone well; we're still stuck with java.util.Date!

We don't have the same resources as Sun or Oracle when it comes to this, so how do we get there? Firstly, our efforts in modularization are seeking to offer a slimmer core standard library, where it becomes more tenable to offer stronger guarantees. We might need to deprecate or split out a few more things first.
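As a concrete example of what modularization looks like from a user's perspective, here is a build.sbt sketch that adds two of the pieces split out of the 2.11 standard library as ordinary dependencies (the version numbers shown are illustrative):

```scala
// build.sbt sketch: XML support and parser combinators, formerly part of the
// monolithic standard library, are now separate modules you add explicitly.
libraryDependencies ++= Seq(
  "org.scala-lang.modules" %% "scala-xml"                % "1.0.2",
  "org.scala-lang.modules" %% "scala-parser-combinators" % "1.0.1"
)
```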

We also know that if new versions of your dependency graph are available, you no longer need binary-compatible Scala releases. That's a motivating factor behind the community build. We might even be able to get to a stage where libraries can go into "escrow" in a community build, and a new version will be automatically built with each new Scala version. An alternative approach is a tool that upgrades an old library for a new Scala version. This still represents a non-trivial effort on our part, so it might need to be funded through commercial support.

 
