Mobup. Finally.

Those among you, my loyal readers, who have already listened to the interview on Mobup published by the Podcast Network know that Mobup – as a DnD pet project – was born specifically to satisfy my own moblogging needs.

At the time I was a happy (and pretty ignorant about the different Java implementations on mobile phones) owner of a Sony Ericsson T610 cameraphone which – due to a major outage – was later replaced with an old P800 phone.

Neither of them, however, could run Mobup: the J2ME multimedia APIs could not control the camera (a frequent – yet hopefully slowly disappearing – problem among cell phones), so there was nothing we could do to get Mobup running on my devices. The whole world (well, nearly :-) was talking about and taking advantage of an app I invented, while I was stuck in the prehistory of moblogging.

Till yesterday.

I eventually decided to make a small investment in a Nokia 6670 cameraphone, which comes equipped with a 1-megapixel camera, 64 MB of available storage and various other wonderful capabilities, for a price comfortably lower than 250€. This now makes it possible for me to moblog using Mobup and to get personal, direct feedback on the application itself (until today I was forced to use Vincenzo's device or to rely on our beta testers' feedback).

In the coming days I'll probably have the opportunity to do some serious moblogging here on Yellowline too, and this is one of the things I love most about Mobup: it is (mostly) blogging-platform independent and, since all the blog posting happens server side once a photo is published on Flickr, it costs absolutely nothing.

Monday morning at the IDII

As previously announced, I spent a nice morning at the Interaction Design Institute Ivrea in Milan listening to a couple of thesis projects (which are now approaching 50% completion). I was invited to hear Vinay Venkatraman's presentation: a bright-minded Indian guy who came up with a nifty prototype of a new way for visually impaired users to interact with web content.

The main idea is that current screen readers (Flash Voice excluded, I dare to say :-) are strictly linear (they scan the page from top to bottom and transform it into synthesized speech, full stop), while we – and by "we" I mean people who can see – usually interact with web content in non-linear ways. So Vinay came up with a solution that translates web page elements into different sounds: a TNICK for a form, a PLICK for a paragraph, a TRSTCH for a link and so on; everything is controlled via a motion-feedback enabled roller, connected to the computer via USB and manipulated by the user.
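The element-to-sound mapping could be sketched as a simple lookup table. This is purely my own illustration, not Vinay's actual code: the cue names come from the examples above, while the class name, the `TICK` fallback and the tag keys are assumptions of mine.

```java
import java.util.HashMap;
import java.util.Map;

/**
 * A minimal, hypothetical sketch of the idea described above: mapping web
 * page element types to short, distinctive audio cues so that a visually
 * impaired user can "scan" a page non-linearly instead of hearing it read
 * top to bottom.
 */
public class ElementSoundMapper {
    // Cue names taken from the article's examples; the keys and the
    // default cue are assumptions for illustration only.
    private static final Map<String, String> CUES = new HashMap<>();
    static {
        CUES.put("form", "TNICK");
        CUES.put("p", "PLICK");
        CUES.put("a", "TRSTCH");
    }

    /** Returns the audio cue for an HTML tag, or a neutral default. */
    public static String cueFor(String tag) {
        return CUES.getOrDefault(tag.toLowerCase(), "TICK");
    }

    public static void main(String[] args) {
        System.out.println(cueFor("a"));     // TRSTCH
        System.out.println(cueFor("FORM"));  // TNICK
        System.out.println(cueFor("div"));   // TICK (assumed default)
    }
}
```

In the real prototype each cue would of course trigger an actual sound sample rather than print a name, with the roller driving which element is currently "under the finger".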

His prototype (which is nearly 70% wood) targets developing countries, so it aims to be EXTREMELY cheap to build, relying on open source software and ready-to-build hardware kits. Vinay is probably betting on the 100$ laptop becoming a smashing hit in the next few years (after the speech I suggested he take the 20$ cellphone into account too). I tried to test the prototype but – thank you, Murphy – the whole app crashed and wouldn't restart; I'm looking forward to retesting it soon.

My visit ended with a quick chat with Fabio and a pizza with JC (thank you for spreading the word on Flash Voice!), Phil Tabor and Neil Churcher, with whom I had an enlightening discussion on the future of mobile television.

Mobup on tour: first speech

Yesterday we were hosted by the wonderful people of the Milan Java User Group: the seminar room at the Mac shop was really crowded and I was pleased to see a couple of familiar faces in the audience.

Our slides (available online) on the strategy, user experience and technical aspects of Mobup raised a lot of interest (we mobupped some photos while speaking); it's always cool speaking to smart programmers who ask smart questions and make smart suggestions.

We've also been lucky enough to be invited to another couple of Italian Java events (one of them really HUGE); more on this in the time to come.