Web navigation on touchpods (part 1)

Today I managed to get back to the team.sPod code. I spent half the afternoon refactoring some of Dirk's code for browsing the team.sPace contributions on the iPhone and on the touchpod. By accident I found out that with the latest version of Safari, web applications can also take full advantage of touch gestures on these two iProducts. So instead of refactoring the interface towards the latest wireframes, Dirk and I spent the rest of the afternoon adding some finger food to team.sPod.

For those who don't know, team.sPod is the iVersion of team.sPace. It is designed to work as a featured web application on Apple's handheld devices. One of the challenges is the access to the contributions, which come with titles, tags, and descriptions. We decided to devote an entire screen to each contribution and use a page-flipping metaphor to let users navigate through the content.

Originally (back in January), I wanted touch-screen navigation support in team.sPod, but at that time the Safari version on my touchpod didn't support it. To work around this limitation, Dirk implemented a rather complex set of invisible DIVs that at least mimicked the idea through taps on the left and right edges of the screen. However, this workaround did not always work as expected and was too clumsy to be an elegant solution. This was not Dirk's fault, by the way, but part of the technical problems that came hand in hand with the work-around.

After upgrading our office touchpods to the latest firmware yesterday, I played a bit with Dashcode to get familiar with the development tools for the iPhone. There I discovered event hooks for capturing and handling touch events from within web applications. This morning I played a bit further and started to dive into Dirk's original code for the front end. One important lesson I learned today is that there are actually no really good tools for developing web applications that can compete with Vim, Emacs, or, if you like, Eclipse. Dashcode is certainly good for learning the general principles of touch-enabled web applications, but when it comes to cool and/or serious applications, it is likely to be among the later choices.

Anyway, enough complaining ;-)

Handling touch events in JavaScript is really simple. Of course, that does not mean it is intuitive, too, as we learned very soon. To capture a touch event, one can hook into Safari's touchstart and touchend events, just as one would do to catch click or drag events. The related event handler is prototyped to accept an event object. Here it starts to get a bit strange: unlike other events, the event object is not directly related to a single touch. Rather, it is a container that holds several lists in which the actual touches are stored. Basically, there are three touch lists: touches, targetTouches, and changedTouches. Intuitively, one would expect to find the event information in the touches list, or at least in targetTouches, but in the end we found that the only usable information is available in the changedTouches list. (see also sitepen for some more details)
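A minimal sketch of what that looks like in practice (the element id and helper name are my own, for illustration only):

```javascript
// Pull the first touch point out of the event's changedTouches list,
// which in our experience is the only list that reliably carries the data.
function extractTouchPoint(event) {
    var touch = event.changedTouches[0];
    return { x: touch.pageX, y: touch.pageY };
}

// Wiring it up in Safari (guarded so the helper above also works
// outside the browser):
if (typeof document !== "undefined") {
    var el = document.getElementById("contribution"); // hypothetical id
    el.addEventListener("touchstart", function (event) {
        var point = extractTouchPoint(event);
        // ... remember point.x as the anchor of the gesture
    }, false);
}
```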

To identify the different types of flipping and scrolling, we capture the touchstart and the touchend event. The touchstart event is only required as an anchor for the direction of the gesture. If someone flips over to another contribution, the difference between the x coordinate of the start point and that of the end point of the gesture indicates the direction of the page flip (forward or backward). Thus, the touchstart event sets the anchor for the gesture, and when the touchend event is triggered, we compare that anchor with the data we get from the second event. In fact, the touchend handler is the hub for identifying the different types of gestures, for example to distinguish horizontal page flipping from vertical scrolling.
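Sketched in code, the comparison boils down to something like this; the function names and the pixel threshold are my own assumptions, not team.sPod's actual values:

```javascript
var anchorX = null; // set by the touchstart handler

function onTouchStart(event) {
    anchorX = event.changedTouches[0].pageX; // anchor for the gesture
}

// Compare the anchor with the end point of the gesture: a swipe to the
// left (end < start) flips forward, a swipe to the right flips backward.
// THRESHOLD (in pixels, chosen here for illustration) filters out taps.
var THRESHOLD = 30;

function flipDirection(startX, endX) {
    var dx = endX - startX;
    if (Math.abs(dx) < THRESHOLD) return "none";
    return dx < 0 ? "forward" : "backward";
}

function onTouchEnd(event) {
    var direction = flipDirection(anchorX, event.changedTouches[0].pageX);
    // ... flip to the next/previous contribution, or do nothing
}
```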

In team.sPod's current version we only check the x coordinates. The drawback of this implementation is that it is not possible to scroll larger content. To make the information more accessible, we will have to integrate some heuristics that determine the intention of a gesture, i.e., find out whether a user wanted to scroll or to flip a page.
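One such heuristic could simply compare the horizontal and the vertical distance covered by the gesture. This is only a sketch of the idea (including the made-up 10-pixel tap tolerance), not what team.sPod currently does:

```javascript
// Classify a gesture from its start and end points: if it moved further
// horizontally than vertically, treat it as a page flip, otherwise as a
// scroll; tiny movements count as a plain tap.
function classifyGesture(start, end) {
    var dx = end.x - start.x;
    var dy = end.y - start.y;
    if (Math.abs(dx) < 10 && Math.abs(dy) < 10) return "tap";
    return Math.abs(dx) > Math.abs(dy) ? "flip" : "scroll";
}
```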

After all, the support for finger gestures holds some great potential for creating really neat web applications for the iPhone and the touchpod, and I am excited to go ahead with the project.