KDE-PIM2: A Distributed Application Framework

A followup discussion to the KDE-PIM roadmap

Overview

This document builds on the ideas proposed in Mike Pilone's KDE-PIM Roadmap. As mentioned in that document, KDE has advanced considerably, both in terms of architecture and applications. However, the KDE-PIM framework has remained relatively fragmented. Initial discussions have led to preliminary designs using a central KDE-PIM Server. This server would provide core PIM functionality, as well as a transparent local vs. remote data store. This paper expands on that idea and attempts to provide that type of functionality to all KDE applications, in the form of an application server.

Background

For those not familiar with the application server / distributed application approach, I'll provide some background. In the traditional application model, applications are completely self-contained executables interacting with the "outside world" through some type of network connection or storage facility, be it flat files, databases, etc. This model was extended to use shared object libraries, or DLLs. Common functionality was shared between applications through underlying system libraries, for example libkdecore.so. The next step was to provide larger pieces of application functionality, or components, to multiple applications. This led to the initial version of KParts/Bonobo. Components were made available to applications via Corba. This had several problems, mostly related to the complexity introduced by the Corba-based architecture. To simplify development and increase performance, Corba was removed from the KParts architecture and replaced with a shared library approach. This allowed components to be loaded dynamically by an application.

However, the magic happens when you realize that the application doesn't need to know the specific component it is loading. Rather, the application knows what type of component it wants and asks a service to provide it. For example, the application can request a component that can edit a text file. In KDE this locator service is the KTrader. KTrader will then find a component that says it can edit text files and return the component to the application. From the application's perspective it has simply received a component that implements the TextEditor interface; it does not know which component is actually being used. To the user, this means they can have a full-featured word processor for editing text files, or good old sed.
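
As a rough sketch, the lookup might look something like the following using the KDE 2-era trader and library loader. The service type, constraint, and exact signatures are from memory and may not match the current headers precisely:

    #include <ktrader.h>
    #include <klibloader.h>
    #include <kparts/part.h>
    #include <qfile.h>

    // Ask the trader for any component that claims it can edit plain text.
    // The application never names a specific implementation.
    KParts::ReadWritePart *createTextEditor(QWidget *parentWidget)
    {
        KTrader::OfferList offers = KTrader::self()->query(
            "text/plain", "'KParts/ReadWritePart' in ServiceTypes");
        if (offers.isEmpty())
            return 0;

        KService::Ptr service = offers.first();

        // Load the component's library and ask its factory for the part.
        KLibFactory *factory =
            KLibLoader::self()->factory(QFile::encodeName(service->library()));
        if (!factory)
            return 0;

        return static_cast<KParts::ReadWritePart *>(
            factory->create(parentWidget, "editor", "KParts::ReadWritePart"));
    }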

Pushing the limits

While the component-based architecture is extremely flexible, it isn't the end of the story. The latest advances in component architectures venture into distributed components. This is basically a matured version of what was originally attempted with Corba. Currently Enterprise Java Beans (EJBs) and Microsoft's DCOM, now .NET, provide a truly distributed component architecture. What's coming around the corner, however, is distributing on a larger scale. This is the idea behind .NET, and it is being pursued somewhat less formally by some EJB enthusiasts.

What do I mean by distributing on a larger scale? I mean larger in terms of functionality. Rather than having a relatively thick application that requests components to provide very specific functionality, the application becomes a relatively thin client and the application itself is distributed amongst various machines, or at least various components on the same machine. Instead of asking for a component that can edit text files, the application may ask for the Calendar Service and then provide a thin GUI on top.

Why should I care?

So how does all this apply to KDE? Several problems are brought to light by the proposed KDE-PIM solution:

  1. Users are often mobile, requiring local copies of the data, but wanting to share it with others over a network as well.
  2. Users have many different "interfaces" to their data: PDAs, Cell Phones, local applications, web interfaces, etc.
  3. Users need to integrate with other users' applications, whether they are KDE apps, Gnome apps, or Windows apps.
  4. There are countless standards available for exchanging and storing data in a cross platform, cross application manner.

You can quickly see that these problems are not unique to KDE-PIM. The question becomes: how can these problems be addressed in such a way as to benefit the whole of KDE? The proposed solution is to introduce an application server to the KDE architecture. Proposals for this application server are discussed later; for now let's continue with the conceptual discussion. KDE applications would make the shift to distributed applications. KOrganizer would become a relatively thin client providing a KDE/Qt GUI, while the core Calendar/Event/Todo functionality would be provided by a GUI-less application running on the application server.
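
To make the split concrete, the server side might expose something like the interface below. The names and types are purely hypothetical and only illustrate where the boundary between the thin KOrganizer client and the GUI-less calendar service would fall:

    #include <qstring.h>
    #include <qdatetime.h>
    #include <qvaluelist.h>

    // Minimal placeholder value types for the sketch.
    struct Event { QString id; QString summary; QDateTime start; QDateTime end; };
    struct Todo  { QString id; QString summary; bool done; };

    // Hypothetical GUI-less calendar service.  The thin client only ever
    // talks to this abstraction (via DCOP, SOAP, etc.); whether the data
    // lives in a local file, a database, or on another machine entirely is
    // the server's business.
    class CalendarService
    {
    public:
        virtual ~CalendarService() {}

        virtual QValueList<Event> eventsBetween(const QDate &from,
                                                const QDate &to) = 0;
        virtual QString addEvent(const Event &event) = 0;  // returns a new event id
        virtual bool removeEvent(const QString &eventId) = 0;
        virtual QValueList<Todo> openTodos() = 0;
    };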

This approach offers several benefits:

  1. Provided the application server offers network connectivity, the "back end" of the application can exist anywhere, and a user can retrieve data from their home machine from the other side of the country without any special coding from the front-end or back-end developer.
  2. Provided the application server offers a standard interface, such as SOAP, the client can be written in any language and on any platform; it could even be a telnet or PDA front end.
  3. Advanced network transparency can occur without the application developer (front end or back end) needing to implement it. For example, if a user typically uses a remote application server to retrieve their data but is going to be "on the road" without network access, they could ask the application server to migrate the data locally. The remote application server, without any special knowledge of the data it will be migrating, can make a connection to the user's local application server and marshal the data. Once the data has been transferred, the thin client is given a new reference (URL, Java reference, SOAP envelope, etc.) pointing to the local app server. The thin client is not aware that the data migrated, yet the user can continue to access it while away from the network. (A sketch of this interaction follows this list.)
  4. Given a sufficiently advanced application server, the above data migration concepts could be extended to application migration. The user could initially try an application while it is running on the server, but could later request that the application be migrated locally. The remote application server then connects to the local application server, streams the application implementation to the local server, and returns a new reference to the client. Again, the client is unaware of what is happening behind the scenes; the application, however, is now fully independent of the remote server.
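
As a sketch of the migration step in point 3, the application server might expose something like the following; the class name, the KURL-based addressing, and the reference type are all hypothetical and only show the shape of the interaction:

    #include <qstring.h>
    #include <kurl.h>

    // Hypothetical migration hook on a server-side service.  The server needs
    // no knowledge of the data it is moving; it streams whatever it holds to
    // another application server and hands back a new reference for the
    // thin client.
    class DataService
    {
    public:
        virtual ~DataService() {}

        // Stream this service's data to another application server and return
        // a reference (URL, SOAP endpoint, ...) to the migrated copy.
        virtual QString migrateTo(const KURL &targetServer) = 0;
    };

    // Client side (again purely illustrative):
    //   QString localRef = remoteCalendar->migrateTo(KURL("appserver://localhost/"));
    //   // The thin client now talks to localRef; the user never notices
    //   // that the data moved.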

And how exactly could this happen?

Actually, most of the above functionality is already available using an EJB application server. Microsoft provides its implementation of this type of functionality with MTS, expanded in its upcoming .NET framework. However, as has been pointed out to me several times, KDE is implemented in C++. Unfortunately, to my knowledge Corba is the only C++-compatible solution that would be able to offer anything close to the functionality described above. Corba has already been demonstrated not to be viable for the average KDE desktop (or developer). And obviously MTS is not a solution. So that pushes us back towards EJB, or writing our own app server. I have to discourage writing a KDE-specific app server given all the options currently available and the need to begin developing applications based on this framework. However, given the current cross-language capabilities of DCOP, it would be very possible to extend DCOP to support a common cross-language protocol, specifically SOAP. KDE development could continue unabated in C++, still using DCOP to make method calls on registered objects; DCOP would simply delegate these calls to the newly created app server. In addition, given the excellent Java bindings to KDE/Qt, it would be possible to develop a KDE application completely in Java and make use of the real EJBs provided by the server.
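
A rough sketch of what the client side of such a bridge could look like today: the application makes an ordinary DCOP call, and a hypothetical gateway registered as "appserver" repackages it as a SOAP request to the real back end. The application, object, and function names below are invented for illustration, and the DCOP API details are from memory:

    #include <kapp.h>
    #include <dcopclient.h>
    #include <qdatastream.h>
    #include <qdatetime.h>

    // An ordinary KDE client making a DCOP call.  Nothing here is new; the
    // interesting part is that "appserver" would be a DCOP-to-SOAP gateway
    // rather than the service itself.
    bool fetchEventsOn(const QDate &day, QByteArray &replyData)
    {
        DCOPClient *dcop = kapp->dcopClient();

        QByteArray data;
        QDataStream arg(data, IO_WriteOnly);
        arg << day;                            // marshal the argument

        QCString replyType;
        return dcop->call("appserver",         // hypothetical gateway application
                          "CalendarService",   // hypothetical object name
                          "eventsOn(QDate)",   // hypothetical function signature
                          data, replyType, replyData);
    }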

If the application server spoke SOAP and DCOP spoke SOAP, the interoperability would extend well beyond distributed KDE applications. In theory a KDE mail program could use a Microsoft .NET back end, or a Windows user could use Outlook to talk to the KDE mail service on the application server. The Gnome folks could build a completely different GnomeMail, but all would use the same back end. Write a Java front end and take your mail program with you onto any platform that supports Java.

So what's the catch?

While I'm very excited about the above ideas, there are several potential problems:

  1. To my present knowledge, the only application server currently available that could provide the desired functionality would be an EJB server. While SOAP would allow us to tie that into the C++ core of KDE, not all platforms supported by KDE have an adequate Java implementation to host a local app server. Developing a custom KDE app server would likely be far beyond the resources currently available to KDE development.
  2. Each KDE user would need his/her own app server running locally. I think this is more of a psychological issue than a real technical one, since Windows NT/2000/XP have been quietly and quite successfully using this approach for some time.
  3. Applications would need to be architected and written with the client/server model in mind. While this would be nowhere near as difficult as the old Corba KParts framework, it would still require a moderate amount of development experience to build a distributed KDE application. However, again I would argue this is largely psychological. It makes sense that building a distributed, multi-tiered, cross-platform application is more involved than developing a simple standalone application.

Obviously this approach would need to be expanded and developed further before any long-term decisions can be made. However, this could prove to be quite an impressive accomplishment for an Open Source project and push KDE/Linux into an industry-leading role. This would not only keep us competitive with .NET, but allow us to leverage any success .NET may have by integrating with services others may offer.

Please feel free to provide any comments, questions, clarifications, or general complaints to Dan Pilone, and thanks for taking the time to make it to the bottom.


Dan Pilone -- 6/16/2001