The Object Teams Blog

Adding team spirit to your objects.

Posts Tagged ‘modularity’

Jigsaw arriving – and now what?

leave a comment »

The debate is over. The result of fierce discussions has been declared. Jigsaw is knocking at the door. Now the fate of Jigsaw depends on the community. Will the community embrace Jigsaw? Will we be able to leverage the new concepts to our benefit? Will we struggle with new pitfalls?

Let’s step back a second: do we really know what Jigsaw is? Do we really understand the design, its subtleties and implications?

At EclipseCon Europe 2017 I will try to shed light on some lesser-known aspects of Jigsaw: a deep dive into things I learned as a JDT committer while implementing tool support for Java 9.

In search of truth

To set the stage, we’ll first have to figure out who or what defines Java 9 — or Jigsaw — or JPMS. This is both a question of specification vs. implementation and a matter of a specification spread over several documents and artifacts. Let’s try to grok Jigsaw from the legally binding sources, rather than from secondary writing and hearsay (if that is possible).

We will have to relearn some of the basic terms, like: what is a package in Java? Do packages form a hierarchy? (I will show how both answers, yes and no, are equally right and wrong.)

Jigsaw is said to do away with some “undesirable” stuff like split packages and cyclic dependencies. Really? (Yes and no.)
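To make those two rules tangible, here is a sketch of a module declaration (the module and package names are made up for illustration): `requires` edges must form an acyclic graph, and no two modules readable by the same module may contain the same package.

```java
// module-info.java — hypothetical names, for illustration only
module com.example.orders {
    // dependency edges: the graph of 'requires' must not contain cycles
    requires com.example.inventory;

    // only exported packages are accessible to other modules; and no
    // other module in the same configuration may also contain the
    // package com.example.orders.api ("no split packages")
    exports com.example.orders.api;
}
```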


Of course, with Jigsaw everything is about encapsulation – easy to agree on, but what is it that a Java 9 module encapsulates? Only a deep understanding of this flavor of encapsulation will tell us what exact qualities we gain from following the new discipline (it is not about security, for example), and also what pains will be incurred on the path of migrating to the new model. (Hint: I will be speaking about both compile time and runtime.)

Interestingly, the magic potion Jigsaw also brings its own antidote: as the rules of encapsulation are tightened, the opposite becomes necessary, too. They call it breaking encapsulation, for which I once coined the term “decapsulation“. I should be flattered by how close Java 9 comes to what I call “gradual encapsulation“. So the talk cannot stop at presenting just the new language concepts; the various extra-lingual means for tuning your architecture also need to be inspected through the looking glass. This is also where tools come into focus: how can JDT empower its users to use the new options without the need to dig through the sometimes cryptic syntax of command-line parameters?
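For the command-line side, those extra-lingual tuning knobs look roughly like this (the module and package names are placeholders; the flags themselves are real JPMS options):

```sh
# grant another module (or the class path, ALL-UNNAMED) access
# to a package that the owning module does not export:
javac --add-exports some.module/some.internal.pkg=ALL-UNNAMED ...

# additionally allow deep reflection (setAccessible) at run time:
java  --add-opens   some.module/some.internal.pkg=ALL-UNNAMED ...

# add a missing 'requires' edge between two modules:
java  --add-reads   some.module=other.module ...
```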

Loose ends

At this point we shall agree that Jigsaw is a compromise, fulfilling many of its goals and promises, while also missing some opportunities.

I will also have to say a few words about long-standing JDT API, which makes guarantees that are no longer valid in Java 9. This raises the question: what is the migration path for tools that sit on top of JDT? (There is no free lunch.)

Finally, it wouldn’t be Java if overloading didn’t make things worse for the newly added concepts. But: haven’t you grown fond of hating it when JDT says:

The type cannot be resolved. It is indirectly referenced from required .class files

We may be seeing light at the end of the tunnel: for Jigsaw we had to revamp the guts of our compiler in ways that could possibly help to finally resolve that problem. Wish me luck …

Hope to see you in Ludwigsburg; there’s much to be discussed over coffee and beer!

EclipseCon Europe 2017 · Ludwigsburg, Germany · October 24 – 26, 2017


Written by Stephan Herrmann

September 10, 2017 at 16:31

Book chapter published: Confined Roles and Decapsulation in Object Teams — Contradiction or Synergy?

leave a comment »

I strongly believe that for perfect modularity, encapsulation in plain Java is both too weak and too strong. This is the fundamental assumption behind a book chapter that has just been published by Springer.

The book is:
Aliasing in Object-Oriented Programming. Types, Analysis and Verification
My chapter is:
Confined Roles and Decapsulation in Object Teams — Contradiction or Synergy?

The concepts in this chapter relate back to the academic part of my career, but all of my more pragmatic tasks since those days indicate that we are still very far away from optimal modularity, and both mistakes are found in real-world software: permitting access too openly and restricting access too strictly. More often than not, it’s the same software exhibiting both mistakes at once.

For the reader unfamiliar with the notions of alias control etc., let me borrow from the introduction of the book:

Aliasing occurs when two or more references to an object exist within the object graph of a running program. Although aliasing is essential in object-oriented programming as it allows programmers to implement designs involving sharing, it is problematic because its presence makes it difficult to reason about the object at the end of an alias—via an alias, an object’s state can change underfoot.


Aliasing, by the way, is one of the reasons why analysis for @Nullable fields is a thorny issue. If alias control could be applied to @Nullable fields in Java, much better static analysis would be possible.

How is encapsulation in Java too weak?

Java only allows protecting code, not objects

This manifests at two levels:

Member access across instances

In a previous post I mentioned that the strongest encapsulation in Java – using the private modifier – doesn’t help to protect a person’s wallet against access from any other person. This is a legacy from the pre-object-oriented days of structured programming. In terms of encapsulation, a Java class is a module utterly unaware of the concept of instantiation. This defect is even more frustrating as better precedents (think of Eiffel) have been set before the inception of Java.

A type system that is aware of instances, not just classes, is a prerequisite for any kind of alias control.
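The wallet problem above fits into a few lines of plain Java (the class and member names are mine, a minimal sketch): the following compiles without complaint, because private grants access to the code of class Person, not to each Person instance individually.

```java
// 'private' protects the class's code, not the object's state.
class Person {
    private int wallet;

    Person(int cents) { this.wallet = cents; }

    int balance() { return wallet; }

    // Compiles fine: inside class Person, the private field
    // of *any* Person instance is accessible.
    int pickPocket(Person victim) {
        int loot = victim.wallet;   // no access error here
        victim.wallet = 0;
        return loot;
    }
}
```

An instance-aware type system would reject the access `victim.wallet` unless `victim` is known to be `this`.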

Object access meets polymorphism

Assume you declare a class as private because you want to keep a particular thing absolutely local to the current module. Does such a declaration provide sufficient protection? No. That private class may just extend another – public – class or implement a public interface. By using polymorphism (an invisible type widening suffices) an instance of the private class can still be passed around in the entire program – by its public super type. As you can see, applying private at the class level doesn’t make any objects private; only the class itself is. Since every class extends at least Object, there is no way to confine an object to a given module; by widening, all objects are always visible to all parts of the program. Put dynamic binding of method calls into the mix, and all kinds of “interesting” effects can be “achieved” on an object whose class is “secured” by the private keyword.
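Here is a small sketch of the leak by widening (names are mine): the class is private, but its instances travel the whole program under a public super type.

```java
import java.util.function.Supplier;

class Module1 {
    // Intended to be absolutely local to this "module":
    private static class Secret implements Supplier<String> {
        @Override public String get() { return "confidential"; }
    }

    // An invisible type widening suffices: the Secret instance
    // escapes under its public super type Supplier<String>.
    static Supplier<String> anySupplier() { return new Secret(); }
}
```

Any client can now call `Module1.anySupplier().get()` and, via dynamic binding, execute code of the “private” class.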

The type system of OT/J supports instance-based protection.

Java’s deficiencies outlined above are overcome in OT/J by two major concepts:

Dependent types
Any role class is a property of the enclosing team instance. The type system allows reasoning about how a role is owned by this enclosing team instance. (read the spec: 1.2.2)
Confined roles
The possible leak by widening can be prevented by sub-classing a predefined role class Confined which does not extend Object. (read the spec: 7.2)
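Schematically, the two concepts look like this in OT/J source (this is OT/J syntax, not plain Java, and the names are made up):

```
public team class Bank {
    // dependent type: every Account role is owned by this Bank
    // instance, and the type system tracks that ownership (spec 1.2.2)
    protected class Account playedBy Client { /* ... */ }

    // confined role: Vault extends Confined instead of java.lang.Object,
    // so no widening can ever leak a Vault instance out of the team (spec 7.2)
    protected class Vault extends Confined { /* ... */ }
}
```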

For details of the type system, why it is suitable for mending the given problems, and why it doesn’t hurt in day-to-day programming, I have to refer you to the book chapter.

How is encapsulation in Java too strict?

If you are a developer with a protective attitude towards “your” code, you will make a lot of things private. Good for you, you’ve created (relatively) well encapsulated software.
But when someone else is trying to make use of “your” code (re-use) in a slightly unanticipated setting (calling for unanticipated adaptations), guess what: s/he’ll curse you for your protective attitude.
Have you ever tried to re-use an existing class and failed, because some **** detail was private and there was simply no way to access or override that particular piece? If you’ve been in this situation before, you’ll know there are basically two answers:

  1. Give up, simply don’t use that (overly?) protective class and recode the thing (which more often than not causes a ripple effect: want to copy one method, end up copying 5 or more classes). Object-orientation is strong on re-use, heh?
  2. Use brute force and don’t tell anybody (tools that come in handy are: binary editing the class file, or calling Method.setAccessible(true)). I’m not quite sure why I keep thinking of Core Wars as I write this 🙂 .
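Option 2 in a plain-Java sketch (names are mine): within one’s own unnamed module this still works as shown, while under JPMS a named module would additionally have to be opened to the caller.

```java
import java.lang.reflect.Field;

class OverlyProtective {
    private int detail = 42;   // the **** detail you cannot reach
}

class BruteForce {
    // Deep reflection pries the field open — and don't tell anybody.
    static int steal(OverlyProtective victim) {
        try {
            Field f = OverlyProtective.class.getDeclaredField("detail");
            f.setAccessible(true);   // disregard 'private'
            return f.getInt(victim);
        } catch (ReflectiveOperationException e) {
            throw new RuntimeException(e);
        }
    }
}
```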

OT/J opens doors for negotiation, rather than arms race & battle

The Object Teams solution rests on two pillars:

  1. Give a name to the act of disregarding an encapsulation boundary: decapsulation. Provide the technical means to punch tiny little holes into the armor of a class / an object. Make this explicit so you can talk and reason about the extent of decapsulation. (read the spec: 2.1.2(c))
  2. Separate technical possibility from questions of what is / should / shouldn’t be allowed. Don’t let technology dictate rules but make it possible to formulate and enforce such rules on top of the existing technology.

Knowing that this topic is controversial, I leave it at this for now (a previous discussion in this field can be found here).

Putting it all together

  1. If you want to protect your objects, do so using concepts that are stronger than Java’s “private”.
  2. Using decapsulation where needed fosters effective re-use without the need for “speculative API”, i.e., making things public “just in case”.
  3. Dependent types and confined roles are a pragmatic entry into the realm of programs that can be statically analysed, with strong guarantees provided by such analysis.
  4. Don’t let technology dictate the rules, we need to put the relevant stakeholders back into focus: Module providers, application developers and end users have different views. Technology should just empower each of them to get their job done, and facilitate transparency and analyzability where desired.

Some of this may be surprising, some may sound like a purely academic exercise, but I’m convinced that the above ingredients, as realized in OT/J, support the development of complex modular systems with unprecedentedly low effective coupling.

FYI, here’re the section headings of my chapter:

  1. Many Faces of Modularity
  2. Confined Roles
    1. Safe Polymorphism
    2. From Confined Types to Confined Roles
    3. Adding Basic Flexibility to Confined Roles
  3. Non-hierarchical Structures
    1. Role Playing
    2. Translation Polymorphism
  4. Separate Worlds, Yet Connected
    1. Layered Designs
    2. Improving Encapsulation by Means of Decapsulation
    3. Zero Reference Roles
    4. Connecting Architecture Levels
  5. Instances and Dynamism
  6. Practical Experience
    1. Initial Comparative Study
    2. Application in Tool Smithing
  7. Related Work
    1. Nesting, Virtual Classes, Dependent Types
    2. Multi-dimensional Separation of Concerns
    3. Modules for Alias Control
  8. Conclusion

Written by Stephan Herrmann

March 30, 2013 at 22:11

Modularity at JavaOne 2012

leave a comment »

Those planning to attend JavaOne 2012 in San Francisco might be busy right now building their personal session schedule. So here’s what I have to offer:

On Wed., Oct. 3, I’ll speak about

Redefining Modularity and Reuse in Variants—
All with Object Teams


I guess one of the goals behind moving Jigsaw out of the schedule for Java 8 was to have more time to improve our understanding of modularity, right? So, why not join me when I discuss how much an object-oriented programming language can help to improve modularity.

When speaking of modularity, I don’t see this as a goal in itself, but as a means for facilitating reuse, and when speaking of reuse I’m particularly interested in reusing a module in a context that was not anticipated by the module provider (that’s what creates new value).

To give you an idea of my agenda, here’re the five main steps of this session:

  1. Warm-up: Debugging the proliferation of module concepts – will there ever be an end? Much of this follows the ideas in this post of mine.
  2. Reuse needs adaptation of existing modules for each particular usage scenario. Inheritance, OTOH, promises to support specialization. How come inheritance is not considered the primary architecture level concept for adapting reusable modules?
  3. Any attempt to apply inheritance as the sole adaptation mechanism in reuse mash-ups results in conflicts, which I characterize as the Dominance of the Instantiator. To mend this contention, inheritance needs a brother with similar capabilities but much more dynamic and thus flexible.
  4. Modularity is about strictly enforced boundaries – good. However, the rules imposed by this regime can only deliver their true power if we’re also able to express limited exceptions.
  5. Given that creating variants and adapting existing modules is crucial for unanticipated reuse, how can those adaptations themselves be developed as modules?

At each step I will outline a powerful solution as provided by Object Teams.

See you in San Francisco!

Edit: The slides and the recording of this presentation are now available online.

Written by Stephan Herrmann

September 20, 2012 at 19:55

Object Teams with Null Annotations

with 5 comments

The recent release of Juno M4 brought an interesting combination: The Object Teams Development Tooling now natively supports annotation-based null analysis for Object Teams (OT/J). How about that? 🙂

The path behind us

Annotation-based null analysis has been added to Eclipse in several stages:

Using OT/J for prototyping
As discussed in this post, OT/J excelled once more in a complex development challenge: it solved the conflict between extremely tight integration and separate development without double maintenance. That part was real fun.
Applying the prototype to numerous platforms
Next I reported that only one binary deployment of the OT/J-based prototype sufficed to upgrade any of 12 different versions of the JDT to support null annotations — looks like a cool product line.
Pushing the prototype into the JDT/Core
Next all of the JDT team (Core and UI) invested efforts to make the new feature an integral part of the JDT. Thanks to all for this great collaboration!
Merging the changes into the OTDT
Now that the new stuff was merged back into the plain-Java implementation of the JDT, it was no longer applicable to other variants, but the routine merge between JDT/Core HEAD and Object Teams automatically brought it back for us. With the OTDT 2.1 M4, annotation-based null analysis is an integral part of the OTDT.

Where we are now

Regarding the JDT, others like Andrey, Deepak and Aysush have beaten me to blogging about the new coolness. It seems the feature even became a top mention of the Eclipse SDK Juno M4. Thanks for spreading the word!

Ah, and thanks to FOSSLC you can now watch my ECE 2011 presentation on this topic.

Two problems of OOP, and their solutions

Now, OT/J with null annotations is indeed an interesting mix, because it solves two inherent problems of object-oriented programming which couldn’t be more different:

1.: NullPointerException is the most widespread and most embarrassing bug that we produce day after day, again and again. Pushing support for null annotations into the JDT has one major motivation: if you use the JDT but don’t use null annotations you’ll no longer have an excuse. For no good reason your code will retain these miserable properties:

  • It will throw those embarrassing NPEs.
  • It doesn’t tell the reader about fundamental design decisions: which part of the code is responsible for handling which potential problems?

Why is this problem inherent to OOP? The dangerous operator that causes the exception is this: “.”

Right, the tiny little dot. And that happens to be the least dispensable operator in OOP.
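The hazard in a trivial sketch of mine: nothing in the signature tells the reader whether null is a legal argument, and the dot fails at run time rather than at compile time.

```java
class DotDemo {
    // The type 'String' is silent about null. With JDT's analysis,
    // annotating the parameter @Nullable would flag the dereference
    // below already at compile time.
    static int length(String s) {
        return s.length();   // the tiny little dot: NPE if s == null
    }
}
```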

2.: Objectivity seems to be a central property of any approach that is based just on objects. While so many other activities in software engineering are based on the insight that complex problems with many stakeholders involved can best be addressed using perspectives and views etc., OOP forces you to abandon all that: an object is an object is an object. Think of a very simple object: a File. Some part of the application will be interested in the content so it can decode the bytes and do something meaningful with them; another part of the application (maybe an underlying framework) will mainly be interested in the path in the filesystem and how it can be protected against concurrent writing; still other parts don’t care about either but only let you send the thing over the net. By representing the “File” as an object, that object must have all properties that are relevant to any part of the application. It must be openable, lockable, sendable and whatnot. This yields bloated objects and unnecessary, sometimes daunting dependencies. Inside the object, all the different use cases it is involved in cannot be separated!
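A plain-Java approximation of this dilemma (interface and class names are mine): interfaces can narrow what each client sees, but the one class must still implement, and internally tangle, every concern.

```java
interface Decodable { byte[] content(); }
interface Lockable  { boolean lock(); }
interface Sendable  { void sendTo(String host); }

// The single object must carry all properties relevant to any client.
class SharedFile implements Decodable, Lockable, Sendable {
    private final byte[] bytes;
    private boolean locked;

    SharedFile(byte[] bytes) { this.bytes = bytes; }

    public byte[] content()         { return bytes.clone(); }
    public boolean lock()           { return locked = true; }
    public void sendTo(String host) { /* network code elided */ }
}

class DecoderClient {
    // This client's perspective is narrowed to Decodable, but the
    // separation exists only at the interface, not inside SharedFile.
    static int decodedLength(Decodable d) { return d.content().length; }
}
```

With OT/J, each perspective would instead live in its own role class attached to a lean base object.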

With roles, objectivity is replaced by a disciplined form of subjectivity: each part of the application will see the object with exactly those properties it needs, mediated by a specific role. New parts can add new properties to existing objects — but not in the unsafe style of dynamic languages, but strictly typed and checked. What does this mean for practical design challenges? For example, direct support for feature-oriented designs – the direct path to painless product lines, etc.

Just like the dot, objectivity seems to be hardcoded into OOP. While null annotations make the dot safe(r), the roles and teams of OT/J add a new dimension to OOP where perspectives can be used directly in the implementation. Maybe it does make sense to have both capabilities in one language 🙂 although one of them cleans up what should have been sorted out many decades ago while the other opens new doors towards the future of sustainable software designs.

The road ahead

The work on null annotations goes on. What we have in M4 is usable and I can only encourage adopters to start using it right now, but we still have an ambitious goal: the null analysis shall not only find some NPEs in your program — eventually, the absence of null-related errors and warnings shall give the developer the guarantee that this piece of code will never throw an NPE at runtime.

What’s missing towards that goal:

  1. Fields: we don’t yet support null annotations for fields. This is next on our plan, but one particular issue will require experimentation and feedback: how do we handle the initialization phase of an object, where fields start as being null? More on that soon.
  2. Libraries: we want to support null specifications for libraries that have no null annotations in their source code.
  3. JSR 308: only with JSR 308 will we be able to annotate all occurrences of types, like, e.g., the element type of a collection (think of List)
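JSR 308 type annotations, as they eventually shipped, make annotations legal on every use of a type, including type arguments. A self-contained sketch with a stand-in annotation of mine (not the real JDT annotation):

```java
import java.lang.annotation.ElementType;
import java.lang.annotation.Retention;
import java.lang.annotation.RetentionPolicy;
import java.lang.annotation.Target;
import java.lang.reflect.AnnotatedParameterizedType;
import java.util.List;

class TypeUseDemo {
    // Stand-in for a TYPE_USE null annotation:
    @Target(ElementType.TYPE_USE)
    @Retention(RetentionPolicy.RUNTIME)
    @interface NonNull {}

    // JSR 308 allows annotating the element type of a collection:
    static List<@NonNull String> names;

    // Count the annotations on the first type argument of the field's type.
    static int annotationsOnElementType() {
        try {
            AnnotatedParameterizedType t = (AnnotatedParameterizedType)
                    TypeUseDemo.class.getDeclaredField("names").getAnnotatedType();
            return t.getAnnotatedActualTypeArguments()[0].getAnnotations().length;
        } catch (NoSuchFieldException e) {
            throw new RuntimeException(e);
        }
    }
}
```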

Please stay tuned as the feature evolves. Feedback including bug reports is very welcome!

Ah, and one more thing in the future: I finally have the opportunity to work out a cool tutorial with a fellow JDT committer: How To Train the JDT Dragon with Ayushman. Hope to see y’all in Reston!

Written by Stephan Herrmann

December 20, 2011 at 22:32

The Essence of Object Teams

leave a comment »

When I write about Object Teams and OT/J I easily get carried away indulging in cool technical details. Recently in the Object Teams forum we were asked about the essence of OT/J, which made me realize that I had neglected the high-level picture for a while. Plus: this picture looks a bit different every time I look at it, so here is today’s answer in two steps: short and extended.

Short version:

OT/J adds to OO the capability to define a program in terms of multiple perspectives.

Extended version:

In software design, e.g., we are used to describing a system from multiple perspectives or views, like: structural vs. behavioral views, architectural vs. implementation views, views focusing on distribution vs. business logic, or just: one perspective per use case.

A collage of UML diagrams

© 2009, Kishorekumar 62, licensed under the Creative Commons Attribution-Share Alike 3.0 Unported license.

In regular OO programming this is not well supported: an object is defined by exactly one class, and no matter from which perspective you look at it, it always has the exact same properties. The best we can do here is control the visibility of methods by means of interfaces.

A person is a person is a person!?

By contrast, in OT/J you start with a set of lean base objects and then you may attach specific roles for each perspective. Different use cases and their scenarios can be implemented by different roles. Visualization in the UI may be implemented by another independent set of roles, etc.

A base seen from multiple perspectives via roles.

Code-level comparison?

I’ve been asked to compare standard OO and OT/J by direct comparison of code examples, but I’m not sure that is a good idea, because OT/J is not intended to just provide a nicer syntax for things you can do in the same way using OO. OT/J – as any serious new programming language should do, IMHO – wants you to think differently about your design. For example, we can give an educational implementation of the Observer pattern using OT/J, but once you “think Object Teams” you may not be interested in the Observer pattern any more, because you can achieve basically the same effect using a single callin binding, as shown in our Stopwatch example.

So, positively speaking, OT/J wants you to think in terms of perspectives and make the perspectives you would naturally use for describing the system explicit by means of roles.

OTOH, a new language is of course only worth the effort if you have a problem with existing approaches. The problems OT/J addresses are inherently difficult to discuss in small examples, because they are:

  1. software complexity: with standard OO, e.g., finding a suitable decomposition in which different concerns are not badly tangled with each other becomes notoriously difficult.
  2. software evolution: e.g., the initial design will be challenged by change requests that no one can foresee up front.

(Plus all the other concerns addressed)

Both theory and experience tell me that OT/J excels in both fields. If anyone wants to challenge this claim using your own example, please do so and post your findings, or let’s discuss possible designs together.

One more essential concept

I should also mention the second essence of OT/J: after slicing elements of your program into roles and base objects, we also support re-grouping the pieces in a meaningful way: as roles are contained in teams, those teams enable you to raise the level of abstraction such that each user-relevant feature, e.g., can be captured in exactly one team, hiding all the details inside. As a result, for a high-level design view you no longer have to look at diagrams of hundreds of classes, but maybe just a few tens of teams.

Hope this helps seeing the big picture. Enough hand-waving for today, back to the code! 🙂

Written by Stephan Herrmann

September 20, 2011 at 15:39

Combination of Concerns

with one comment

If you are interested in more than two of these …

… you’ll surely enjoy this:

Hands-on introduction to Object Teams

See you at EclipseCon!

Written by Stephan Herrmann

March 17, 2011 at 16:52

Null annotations: prototyping without the pain

with 3 comments

So, I’ve been working on the annotations @NonNull and @Nullable so that the Eclipse Java compiler can statically detect NullPointerExceptions at compile time (see also bug 186342).

By now it’s clear this new feature will not be shipped as part of Eclipse 3.7, but that needn’t stop you from trying it, as I have uploaded the thing as an OT/Equinox plugin.

Behind the scenes: Object Teams

Today’s post shall focus on how I built that plugin using Object Teams, because it greatly shows three advantages of this technology:

  • easier maintenance
  • easier deployment
  • easier development

Before I go into details, let me express a warm invitation to our EclipseCon tutorial on Thursday morning. We’ll be happy to guide your first steps in using OT/J for your most modular, most flexible and most maintainable code.

Maintenance without the pain

It was suggested that I should create a CVS branch for the null annotation support. This is a natural choice, of course. I chose differently, because I’m tired of double maintenance: I don’t want to spend my time applying patches from one branch to the other and mending merge conflicts, so I avoid it wherever possible. You don’t think this kind of compiler enhancement can be developed outside the HEAD stream of the compiler without incurring double maintenance? Yes it can. With OT/J we have the tight integration that is needed for implementing the feature while keeping the sources well separated.

The code for the annotation support even lives in a different source repository, but the runtime effect is the same as if all this were already an integral part of the JDT/Core. I should mention that for this particular task the integration using OT/J causes a somewhat noticeable performance penalty: the compiler does an awful lot of work, and hooking into this busy machine comes at a price. So yes, at the end of the day this should be re-integrated into the JDT/Core. But for the time being the OT/J solution serves its purpose well (and in most other situations you won’t even notice any impact on performance; plus, we already have further performance improvements in the OT/J runtime in our development pipeline).

Independent deployment

Had I created a branch, the only way to get this to you early adopters would have been via a patch feature. I do have some routine in deploying patch features but they have one big drawback: they create a tight dependency to the exact version of the feature which you are patching. That means, if you have the habit of always updating to the latest I-build of Eclipse I would have to provide a new patch feature for each individual I-build released at Eclipse!

Not so for OT/Equinox plug-ins: in this particular case I have a lower bound: the JDT/Core must be from a build ≥ 20110226. Other than that, the same OT/J-based plug-in seamlessly integrates with any Eclipse build. You may wonder how I can be so sure. There could be changes in the JDT/Core that break the integration. Theoretically: yes. Actually, as a JDT/Core committer I’ll be the first to know about those changes. But most importantly: from many years’ experience of using this technology I know such breakage is very rare, and should a problem occur it can be fixed in the blink of an eye.

As a special treat the OT/J-based plug-in can even be enabled/disabled dynamically at runtime. The OT/Equinox runtime ships with the following introspection view:

Simply unchecking the second item dynamically disables all annotation based null analysis, consistently.

Enjoyable development

The Java compiler is a complex beast. And it’s not exactly small: over 5 MB of source spread over 323 classes in 13 packages, the central package of these (ast) comprising no fewer than 109 classes. To add insult to injury: each line of this code could easily get you puzzling for a day or two. It ain’t easy.

If you are a wizard of the patches, feel free to look at the latest patch from the bug. Does that look like something you’d like to work on? Not after you’ve seen how nice & clean things can be, I suppose.

First level: overview

Instead of unfolding the package explorer until it shows all relevant classes (at what time the scrollbar will probably be too small to grasp) a quick look into the outline suffices to see everything relevant:

Here we see one top-level class, actually a team class. The class encapsulates a number of role classes containing the actual implementation.

Navigation to the details

Each of those role classes is bound to an existing class of the compiler, like:

   protected class MessageSend playedBy MessageSend { ...
(Don’t worry about the identical names: already from the syntax it is clear that the left identifier MessageSend denotes a role class in the current team, whereas the second MessageSend refers to an existing base class imported from some other package.)

Ctrl-click on the right-hand class name takes you to that base class (the packages containing those base classes are indicated in the above screenshot). This way the team serves as the single point of reference from which each affected location in the base code can be reached with a single mouse click – no matter how widely scattered those locations are.

When drilling down into details, a typical role looks like this:

The three top items are “callout” method bindings providing access to fields or methods of the base object. The bottom item is a regular method implementing the new analysis for this particular AST node, and the item above it defines a “callin” binding which causes the new method to be executed after each execution of the corresponding base method.

Locality of new information flows

Since all these roles define almost normal classes and objects, additional state can easily be introduced as fields in role classes. In fact some relevant information flows of the implementation make use of role fields for passing analysis results directly from one role to another, i.e., the new analysis mostly happens by interaction among the roles of this team.

Selective views, e.g., on inheritance structures

As a final example consider the inheritance hierarchy of class Statement: In the original this is a rather large tree:

Way too large actually to be fully unfolded in a single screenshot. But for the implementation at hand most of these classes are irrelevant. So at the role layer we’re happy to work with this reduced view:

This view is not obtained by any filtering in the IDE, but that’s indeed the real full inheritance tree of the role class Statement. This is just one little example of how OT/J supports the implementation of selective views. As a result, when developing the annotation based null analysis, the code immediately provides a focused view of everything that is relevant, where relevance is directly defined by design intention.

A tale from the real world

I hope I could give an impression of a real-world application of OT/J. I couldn’t think of a nicer structure for a feature of this complexity built on an existing code base of this size. It’s actually fun to work with such powerful concepts.

Did I already say? Don’t miss our EclipseCon tutorial 🙂

Hands-on introduction to Object Teams

See you at EclipseCon!

Written by Stephan Herrmann

March 14, 2011 at 00:18