As part of a recent video recording session I created a Location API example to demonstrate just how simple it is to add context-awareness to your applications. Context-awareness is something I know quite a bit about; it was actually the subject of my dissertation. Back then I created something called “BlueSpot”, a language and server-to-mobile-client system that provided contextual information and learned about you over time.
In practice the idea was quite simple: shops, bus stops and even your own home could begin to channel information about you and act accordingly. OK, OK, it was about mobile advertising. The idea came from an incredible paper called “Context-Aware Computing Applications” (Schilit ’94), and so I decided to build a context-aware application for mobile phones.
The example I created was centred on a record store that could send you messages about your favourite artist, yet it was generic enough that a bus stop could learn your route and ensure you get to work on time. Context-awareness, I wrote, “is the sum of inferences derived from physical, temporal, emotional needs mixed with intention”. Admittedly I didn’t really know what I meant at the time; there were simply no real-world examples around.
So let’s break it down.
Example: Matt’s heavy night out
- Matt arrives home at 3am having had too much to drink on a Thursday night
- Like most people in the UK, Matt is normally in the office by 9am
- On Friday he doesn’t have that much to do in the morning
So how would a context-aware application help out? Well, given a bit of fine-tuning over time, you can imagine that a context-aware application might wake Matt up a little later on Friday. It may even have ordered a cab, or sent a lovely excuse email to the boss while Matt slept in.
“Hi, I’m stuck at the dentist for an hour this morning. Matt” – obviously the app would swap “dentist” for a suitable, fresh excuse each time.
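The reasoning in that example is really just a rule over a handful of context signals. Here is a minimal sketch of one such rule, written in TypeScript purely for illustration; every name and threshold below is invented, not part of any real API:

```typescript
// Hypothetical context rule: push the alarm back when the user got home
// very late and the next morning looks quiet. All fields and thresholds
// here are invented for illustration.
interface Context {
  arrivedHomeHour: number; // 24h clock, e.g. 3 for 3am
  morningTasks: number;    // calendar items before noon
  usualWakeHour: number;   // e.g. 7 for 7am
}

function suggestWakeHour(ctx: Context): number {
  const lateNight = ctx.arrivedHomeHour >= 1 && ctx.arrivedHomeHour <= 5;
  const quietMorning = ctx.morningTasks <= 1;
  if (lateNight && quietMorning) {
    return ctx.usualWakeHour + 2; // let them sleep in
  }
  return ctx.usualWakeHour;
}

// Matt: home at 3am, one morning task, usually up at 7 -> wake at 9
console.log(suggestWakeHour({ arrivedHomeHour: 3, morningTasks: 1, usualWakeHour: 7 })); // 9
```

A real system would of course learn these thresholds over time rather than hard-code them, which is exactly the “fine-tuning” mentioned above.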
New devices, new contexts
With our continued drive to bring the full Flash Player 10.1 and AIR to devices, including desktops, netbooks, tablets and mobile phones, you can imagine new sets of contexts appearing. Arguably this could just mean that we’re moving away from consolidated, do-everything devices, although I like to think that consolidation is still happening. So your mobile phone will have a camera just in case, but it’s unlikely to be your camera of choice.
Applications are moving in much the same way: we’ve seen a trend towards the availability of information across screens and on different devices. That said, these applications are going to be created with their context in mind.
So we do need to extend the vision for context-aware applications to include the device, its characteristics, and the human interface guidelines set out for the experience. Google’s Eric Schmidt unveiled the new Google mantra of “Mobile First”, clearly a sign that creating applications for the mobile context first will ultimately create better applications that scale across devices.
Context-awareness is not about moving a few buttons around, cutting down on a few components and resizing videos. It’s ultimately about understanding your users, addressing their needs in the context of the moment and enabling them to gain access to your content from any (relevant) screen.
This is the reason we added Location APIs to AIR, the iPhone and FL4: location is a key part of context-awareness.
So hopefully that little introduction has sparked some imaginative ideas around context-aware applications. To that end I have provided below a little example application called FindMe. It doesn’t do anything that special, but it shows you a Google Map of your location and then allows you to search for places. To use it you’ll need to get a Google API key.
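For a flavour of what FindMe does once it has a location fix, here is a small TypeScript sketch (standing in for the ActionScript original) that builds a Google Static Maps image URL centred on the device’s coordinates. The `staticMapUrl` helper is my own invention for this post; the URL parameters follow Google’s Static Maps API, and `apiKey` is the Google API key mentioned above:

```typescript
// A latitude/longitude pair as a location API might report it.
interface Fix {
  latitude: number;
  longitude: number;
}

// Build a Google Static Maps URL centred on the fix. staticMapUrl is a
// hypothetical helper; the query parameters follow the Static Maps API.
function staticMapUrl(fix: Fix, apiKey: string, zoom = 14): string {
  const center = `${fix.latitude},${fix.longitude}`;
  return (
    "https://maps.googleapis.com/maps/api/staticmap" +
    `?center=${encodeURIComponent(center)}` +
    `&zoom=${zoom}&size=320x240` +
    `&key=${encodeURIComponent(apiKey)}`
  );
}

// Example: central London with a placeholder key
console.log(staticMapUrl({ latitude: 51.5074, longitude: -0.1278 }, "YOUR_API_KEY"));
```

In the real app the fix would come from the device’s location API rather than being hard-coded, and the resulting URL is simply loaded as an image.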
You could extend it to track your position over time, or maybe guess where you are based on your past history?
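As a sketch of that extension (again in TypeScript for illustration; `guessPlace`, the place list and the 0.5 km threshold are my assumptions, not part of FindMe), you could compare a new fix against named places you have seen before and return the nearest one:

```typescript
// A named place remembered from past fixes.
interface Place {
  name: string;
  latitude: number;
  longitude: number;
}

// Equirectangular distance approximation in km; fine for short ranges.
function approxDistanceKm(
  a: { latitude: number; longitude: number },
  b: Place
): number {
  const kmPerDegLat = 111.32;
  const dLat = (a.latitude - b.latitude) * kmPerDegLat;
  const dLon =
    (a.longitude - b.longitude) * kmPerDegLat *
    Math.cos((a.latitude * Math.PI) / 180);
  return Math.sqrt(dLat * dLat + dLon * dLon);
}

// Guess which known place a fix is at: nearest place within maxKm, else null.
function guessPlace(
  fix: { latitude: number; longitude: number },
  history: Place[],
  maxKm = 0.5
): Place | null {
  let best: Place | null = null;
  let bestKm = maxKm;
  for (const place of history) {
    const d = approxDistanceKm(fix, place);
    if (d <= bestKm) {
      best = place;
      bestKm = d;
    }
  }
  return best;
}
```

Feed it a history of labelled fixes and it will start answering “you’re probably at home” the moment a new reading arrives, which is exactly the kind of inference a context-aware app builds on.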
The idea is to get started and learn how to produce applications for users in different contexts, and it would be great to see your ideas and results!