
iHome

This is a research project on future home environments conducted by the Center for Interactive Spaces, involving Bang & Olufsen A/S, Aarhus University, and the Aarhus School of Architecture. The project started in August 2003 and ended on the 1st of April 2006.

Prototypes

The CASOME Platform

In the project we have developed a novel home media platform called CASOME. It is a general context-aware, collaborative software infrastructure, which has been tailored to run on a variety of hardware setups that we anticipate will become pervasive in future homes.

The CASOME application consists of a series of application components that are integrated into a common application framework, which can be configured specifically for each display in the home. The displays may be touch monitors, passive monitors, projectors or sound “displays” (audio systems with speakers). The displays may be embedded in tables or walls, projected onto floors or ceilings, and so on.

The application may be configured to control specific media objects such as pictures, video, Web pages, and music, as well as collections of media clips. The application can be configured as a standalone display, a table, or a desktop PC depending on the input devices available. For example, the kitchen display uses speech for interacting with the system while cooking and touch for controlling playback and basic organization, while the hallway display shows a slideshow of images without interaction possibilities.
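
To give an idea of what such a per-display configuration could look like, here is a minimal Java sketch. The DisplayConfig record, the component names, and the Input values are hypothetical illustrations, not the actual CASOME classes.

    import java.util.List;

    // Hypothetical sketch of per-display configuration: each display gets a
    // subset of application components and input modalities.
    public class DisplayConfigDemo {

        enum Input { TOUCH, SPEECH, NONE }

        record DisplayConfig(String location, List<String> components, List<Input> inputs) {}

        public static void main(String[] args) {
            // The kitchen display combines speech (hands-free while cooking)
            // with touch for playback and basic organization.
            DisplayConfig kitchen = new DisplayConfig(
                    "kitchen",
                    List.of("player", "organizer-basic"),
                    List.of(Input.SPEECH, Input.TOUCH));

            // The hallway display runs a non-interactive image slideshow.
            DisplayConfig hallway = new DisplayConfig(
                    "hallway",
                    List.of("slideshow"),
                    List.of(Input.NONE));

            for (DisplayConfig c : List.of(kitchen, hallway)) {
                System.out.printf("%s -> components=%s inputs=%s%n",
                        c.location(), c.components(), c.inputs());
            }
        }
    }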

In the Organizer configuration of the application (see screenshot below), media clips and collections can be organized, shared, sorted, linked to places, and previewed in the built-in players for web, video, music, and image content. To the left in the interface is a chest of drawers where users can archive media clips, copy them from other places in the home, or access their personal objects for playback or sharing.

The MediaOrganizer

I took part in the workshops, the conceptual work, and the programming of the prototype. The prototype was written in Java, with Batik SVG and QuickTime on the frontend and MySQL and UPnP on the backend. The prototype was evaluated in the lab by bringing in four different families; later, the entire setup was evaluated in a real home in Gram near Skanderborg for two weeks. In March 2006 the prototype was presented at the Summit 06 conference.

Click here to watch a video of the prototype in action. (~42 MB)


MultiLightTracker

In a social setting, current computer systems don’t support multiple users in organizing, sharing, and viewing content together: interaction is primarily limited to one user at a time controlling the mouse or keyboard while the others watch. In the CASOME system this problem has been addressed by giving each user their own pen to interact with the system running on the living room table. The technique that makes this possible is called the MultiLightTracker.

MultiLightTracker is a simple and robust technique for simultaneously tracking multiple objects on a semi-transparent surface. Behind the surface, a camera tracks the location of each pen based on the color emitted from its LED, and each pen location is associated with an identifier based on that unique color. The coordinates and IDs can be used as input to applications, allowing multiple users to interact simultaneously.
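
As an illustration of the idea, the following Java sketch classifies bright, saturated pixels in a camera frame by hue and reports one centroid per hue bucket as a pen position with a color-based ID. It is a simplified sketch with made-up thresholds, not the project’s actual tracking code.

    import java.awt.Color;
    import java.awt.image.BufferedImage;
    import java.util.HashMap;
    import java.util.Map;

    public class LightTrackerSketch {

        // Map a hue bucket to a pen ID; each pen's LED has a unique color.
        static int hueToPenId(float hue) {
            return (int) (hue * 6) % 6;
        }

        static Map<Integer, double[]> trackPens(BufferedImage frame) {
            Map<Integer, double[]> sums = new HashMap<>(); // id -> {sumX, sumY, count}
            for (int y = 0; y < frame.getHeight(); y++) {
                for (int x = 0; x < frame.getWidth(); x++) {
                    int rgb = frame.getRGB(x, y);
                    float[] hsb = Color.RGBtoHSB(
                            (rgb >> 16) & 0xFF, (rgb >> 8) & 0xFF, rgb & 0xFF, null);
                    // Only saturated, bright pixels count as LED light shining
                    // through the semi-transparent surface.
                    if (hsb[1] > 0.6f && hsb[2] > 0.8f) {
                        double[] s = sums.computeIfAbsent(hueToPenId(hsb[0]), k -> new double[3]);
                        s[0] += x; s[1] += y; s[2]++;
                    }
                }
            }
            Map<Integer, double[]> centroids = new HashMap<>();
            sums.forEach((id, s) -> centroids.put(id, new double[]{s[0] / s[2], s[1] / s[2]}));
            return centroids;
        }

        public static void main(String[] args) {
            BufferedImage frame = new BufferedImage(320, 240, BufferedImage.TYPE_INT_RGB);
            frame.setRGB(100, 80, Color.RED.getRGB());    // one pen's LED
            frame.setRGB(200, 150, Color.GREEN.getRGB()); // another pen's LED
            trackPens(frame).forEach((id, c) ->
                    System.out.printf("pen %d at (%.0f, %.0f)%n", id, c[0], c[1]));
        }
    }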

MultiLightTracker

The multi-user table required an interface designed for a 360-degree experience, allowing users to organize content alone or together around the table. Collections can be rotated by dragging a “stone” around the canvas of the collection; when the stone is placed in the middle of the canvas, the content rotates around the center.
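
One plausible way to compute such a rotation is sketched below: the angle of the stone’s position relative to the canvas center becomes the rotation angle of the content. This is an assumption for illustration; the function and coordinates are not from the project code.

    public class StoneRotation {

        /** Rotation angle (radians) of the canvas content, derived from the stone. */
        static double rotationAngle(double cx, double cy, double stoneX, double stoneY) {
            return Math.atan2(stoneY - cy, stoneX - cx);
        }

        public static void main(String[] args) {
            double cx = 200, cy = 200; // canvas center
            // Stone at the right edge: 0 degrees; below the center: 90 degrees.
            System.out.printf("right: %.0f deg%n",
                    Math.toDegrees(rotationAngle(cx, cy, 300, 200)));
            System.out.printf("below: %.0f deg%n",
                    Math.toDegrees(rotationAngle(cx, cy, 200, 300)));
        }
    }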

The concept was originally conceived by Jesper Nielsen. Andreas Lykke-Olesen and I worked on the software infrastructure that made it possible to build applications around it. Søren Boll Overgaard worked on the calibration routine with help from Michael Bang.

Click here to watch a video of the MultiLightTracker prototype in action. (~9 MB)


eMote - Gesture-Based Interaction

Gesture-based interaction is provided through the eMote, a one-button gesture-based remote control for interacting with music, movies, and other media in the home. The eMote makes it possible to record gestures with the device and map them to playback controls for clips on the displays, e.g. image slideshows, music playlists, etc. The current system allows one to turn the music off by turning the remote upside down, to skip tracks by making a throw gesture, and to turn the volume up and down by tilting the remote vertically.
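
The gesture-to-command mapping described above could be sketched in Java as follows. Gesture detection from the accelerometer data is omitted, and the Gesture enum and Player interface are hypothetical names, not the eMote’s actual API.

    public class EmoteMapping {

        enum Gesture { UPSIDE_DOWN, THROW, TILT_UP, TILT_DOWN }

        interface Player {
            void stop();
            void skipTrack();
            void changeVolume(int delta);
        }

        static void dispatch(Gesture g, Player player) {
            switch (g) {
                case UPSIDE_DOWN -> player.stop();      // remote turned upside down
                case THROW       -> player.skipTrack(); // throw gesture skips tracks
                case TILT_UP     -> player.changeVolume(+5);
                case TILT_DOWN   -> player.changeVolume(-5);
            }
        }

        public static void main(String[] args) {
            Player logPlayer = new Player() {
                public void stop() { System.out.println("stop"); }
                public void skipTrack() { System.out.println("skip"); }
                public void changeVolume(int d) { System.out.println("volume " + d); }
            };
            dispatch(Gesture.THROW, logPlayer);
            dispatch(Gesture.TILT_UP, logPlayer);
        }
    }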

eMote

The eMote is a contextual input device that works automatically based on the user’s location in the home. For example, when the user approaches a display in the home, the eMote is recognized within a few seconds through Bluetooth RSSI measurements. If the user wants to pick up a photo and move it to the entrance, a grab gesture is performed; the user can then walk to the entrance and, once the eMote has been recognized there, throw the clip to add it to the display. If the user picks up playing music or a movie, the playhead position is stored until the clip is dropped on a new display, where playback resumes. It is also possible to grab an entire collection of clips and move it to another display.
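
The proximity sensing can be pictured as in this sketch: the display whose Bluetooth receiver reports the strongest RSSI for the eMote, above a minimum threshold, is taken as the user’s current display. The threshold, display names, and readings are made up for illustration.

    import java.util.Map;
    import java.util.Optional;

    public class RssiProximity {

        static final int MIN_RSSI_DBM = -70; // weaker than this: user not nearby

        static Optional<String> nearestDisplay(Map<String, Integer> rssiByDisplay) {
            return rssiByDisplay.entrySet().stream()
                    .filter(e -> e.getValue() >= MIN_RSSI_DBM)
                    .max(Map.Entry.comparingByValue())
                    .map(Map.Entry::getKey);
        }

        public static void main(String[] args) {
            // Simulated readings (dBm) from each display's Bluetooth receiver.
            Map<String, Integer> readings = Map.of(
                    "livingroom-table", -55,
                    "kitchen", -68,
                    "entrance", -82);
            nearestDisplay(readings).ifPresent(d ->
                    System.out.println("eMote recognized at: " + d));
        }
    }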


Speech Recognition

We also used Microsoft speech recognition technology. Thomas Riisgaard wrote a .NET application that made it easy to recognize speech commands and send them to an application. This could be useful in a kitchen scenario where a remote control would not be practical.
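
Since the recognizer itself ran in .NET while the prototype was Java, the receiving side might be sketched like this, assuming recognized commands arrive as plain text over UDP. The port number and command vocabulary are invented for illustration.

    import java.net.DatagramPacket;
    import java.net.DatagramSocket;
    import java.nio.charset.StandardCharsets;

    public class SpeechCommandReceiver {

        public static void main(String[] args) throws Exception {
            try (DatagramSocket socket = new DatagramSocket(9876)) {
                byte[] buf = new byte[256];
                while (true) {
                    DatagramPacket packet = new DatagramPacket(buf, buf.length);
                    socket.receive(packet);
                    String command = new String(packet.getData(), 0,
                            packet.getLength(), StandardCharsets.UTF_8).trim();
                    switch (command) { // e.g. spoken commands in the kitchen
                        case "play" -> System.out.println("starting playback");
                        case "stop" -> System.out.println("stopping playback");
                        case "next" -> System.out.println("next track");
                        default     -> System.out.println("unknown: " + command);
                    }
                }
            }
        }
    }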

People

Links

Project page:
http://www.interactivespaces.net/projects/iHome

Bang & Olufsen:
http://www.bang-olufsen.dk

 
