Breaking it down
My previous post was a bit disorganised. I have so many opinions about academia, some of them very strong, that I couldn’t focus. So I decided to do a more structured analysis of the next paper. By the time I finish this post, I will have finished reading the paper: "Mobile Computing with the Rover Toolkit" by Anthony D. Joseph, Student Member, IEEE, Joshua A. Tauber, Student Member, IEEE and M. Frans Kaashoek, Member, IEEE. Published in IEEE TRANSACTIONS ON COMPUTERS, VOL. 46, NO. 3, MARCH 1997.
Let’s start with what they set out to show in this paper:
In this paper, we describe the Rover toolkit, a set of software tools that supports applications that operate obliviously to the underlying environment, while also enabling the construction of applications that use awareness of the mobile environment to adapt to its limitations.
Quite a lofty goal, and again I flag the “trying to take over the world” stance. Moreover, Lalith just popped in and we got into a discussion: should researchers really produce applications?
They’re apparently evaluating this with: “We illustrate the effectiveness of the toolkit using a number of distributed applications, each of which runs well over networks that differ by three orders of magnitude in bandwidth and latency.” It remains to be seen in which context, though. Immediately afterwards, the authors go on about a number of assumptions, and a lengthy description of the characteristics of mobile computing is presented. Too many details, especially regarding data consistency, are presented in my opinion. Eventually they reach some kind of conclusion:
[…] a mobile-aware application can store not only the value of a write, but also the operation associated with the write. That operation can include any relevant context. Storing the operation allows the application to use application-specific semantic and contextual information;
Great. Basically, applications need to be aware of what’s going on underneath (i.e., what network connectivity do we have? How much battery is left?) so that they can optimise performance accordingly.
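The quoted idea of storing the operation, not just the written value, can be sketched roughly like this. This is a hypothetical illustration, not Rover’s actual API: `Operation`, `OperationLog`, and the e-mail example are my own names, invented for this sketch.

```python
from dataclasses import dataclass, field
from typing import Callable

@dataclass
class Operation:
    name: str                      # e.g. "move_message"
    apply: Callable[[dict], dict]  # how to replay the write on server state
    context: dict                  # application-specific context (network, battery, ...)

@dataclass
class OperationLog:
    entries: list = field(default_factory=list)

    def record(self, op: Operation):
        self.entries.append(op)

    def replay(self, state: dict) -> dict:
        # On reconnection, replay the logged operations instead of blindly
        # overwriting values; each operation carries its own context,
        # which the application can use for conflict resolution.
        for op in self.entries:
            state = op.apply(state)
        return state

# Usage: an e-mail client logs "move message" as an operation while offline
log = OperationLog()
log.record(Operation(
    name="move_message",
    apply=lambda s: {"inbox": [m for m in s["inbox"] if m != "msg1"],
                     "archive": s["archive"] + ["msg1"]},
    context={"network": "offline"},
))
state = log.replay({"inbox": ["msg1", "msg2"], "archive": []})
# state["archive"] now contains "msg1"
```

The point of the sketch is that the server receives a semantically meaningful operation (“move this message”) rather than two opaque new folder values, which is what makes application-specific conflict resolution possible.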
Here’s a summary of some implementation details:
- The Rover toolkit uses a client-server model which provides optimistic concurrency control and caching.
- It is possible to transfer code and data for computation at a remote location.
- Remote procedure calls can be queued.
- Servers can also be run on mobile devices.
- The main challenge for the programmer is to define the so-called relocatable dynamic objects (RDOs), the communication between clients and servers, as well as any conflict resolution.
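The queued remote procedure calls from the list above can be sketched as follows. This is a minimal sketch under my own assumptions; `QueuedRPC`, `call`, and `reconnect` are illustrative names, not Rover’s interface.

```python
from collections import deque

class QueuedRPC:
    """Toy model of a queued RPC: calls made while disconnected are
    buffered locally and drained in order when connectivity returns."""

    def __init__(self):
        self.queue = deque()
        self.connected = False

    def call(self, proc, *args):
        if self.connected:
            return proc(*args)           # deliver immediately
        self.queue.append((proc, args))  # queue for later delivery
        return None                      # caller continues optimistically

    def reconnect(self):
        # Drain the queue in FIFO order once the link is back.
        self.connected = True
        results = []
        while self.queue:
            proc, args = self.queue.popleft()
            results.append(proc(*args))
        return results

# Usage: while offline, sends are queued, not lost
rpc = QueuedRPC()
sent = []
rpc.call(sent.append, "mail 1")
rpc.call(sent.append, "mail 2")
assert sent == []  # nothing delivered yet
rpc.reconnect()
# sent == ["mail 1", "mail 2"]
```

The design choice worth noticing is that `call` returns immediately even when disconnected, which is what lets the application keep working during the intermittent connectivity the paper targets.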
Here it is worth noting: does this really make life simpler for the programmer? Surely they no longer have to worry about moving the actual code. But how flexible and adaptable is it? What trade-offs do they have to make?
They present four main results:
- QRPC is well suited to intermittently connected environments.
- Using RDOs enables remote computation of heavy tasks and reduces latency and bandwidth usage.
- Porting to Rover apparently requires little work (only three weeks in one case… ehrm… ok?)
- Mobile-aware applications using Rover perform better on slow networks compared to their original versions
- (Then they mention a fifth, despite announcing only four results:) UIs are faster too.
In the related work we find that Rover is, apparently, also the first toolkit to support both the development of mobile-aware applications and so-called proxies that enable untouched applications to benefit from the “mobile-awareness”. (This, once again, feels like a marketing trick… but sure, let’s go with it.)
I won’t describe anything from the implementation details. To be honest, I didn’t read them too carefully either. Let me add a thought that emerged from these sections, though: there are potentially some ideas here that have migrated into “real” products. The details are also somewhat irrelevant, since the paper was written in 1997 and much has changed since.
Section 5 presents the programmer with something similar to guidelines for how to port, or integrate, Rover into their mobile applications. It looks like a good overview of the steps required to use Rover. Question: a significant amount of thinking has gone into this project, so why spoil it with minuscule details like:
"The application developer also must decide which mechanisms to use for notifying users of the cache status of displayed data. In the e-mail application, color is used to distinguish operations that have not been propagated to a server."
Secondly, that is not exactly unobtrusive. How many users know what a cache is?
A table shows the number of lines changed to integrate Rover into existing applications, or to add it to new ones. Convincing? Still, it is good to see how much work is required, or at least to get an estimate of it.
Lab tests were carried out for the evaluation. There is a concise list of hypotheses that they are evaluating. I like it. Unfortunately, there are only internal comparisons. We discussed in class one day that it is hard to do benchmarks (in general) in computer science because the field moves too fast. I believe that if you cannot do quantitative measurements, you should at least provide qualitative assessments.
Their evaluation obviously shows gains (I have only seen a few papers where a solution was disproved… those were an entertaining read!). I’m mostly concerned about their numbers; 17% doesn’t sound like a significant gain. Is it worth it? An increase in bugs? Their final graph on speedup shows some promising results: there is a significant improvement over the original versions (based on a subset of tasks). A 7.5× speedup over slow networks is mentioned in the conclusion.
However, they do a bad job of connecting to their original goal.
"We have found it quite easy to adapt applications to use these Rover facilities"
What does that mean, by the way? Really, when you’re making qualitative statements, provide a solid argument. Don’t make loose relative statements from your quantitative 7.5× speedups.
In practice, we find the combination of the Rover cache, relocatable dynamic objects, and queued remote procedure calls results in a surprisingly useful system.
Surprise! Now, apart from you guys, who did or does find it useful? Show me!
Once again, the paper presents some cool ideas, probably genuine and innovative at the time of writing. But seriously, even if this paper is 14 years old, this still happens today. Perhaps even more, because of increased competition.
Footnote: I wrote this post as I was reading the paper, in an attempt to track my thoughts. In other words, an experiment.