February 15, 2011

Thesis submission

I hereby submit my diploma thesis for review:

Till Ballendat. Visualization of and Interaction with Digital Devices around Large Surfaces as a Function of Proximity. Diploma thesis, LMU Munich, February 15, 2011.


      January 10, 2011

      Proxemic Pong using the Kinect sensor

      I bought a Kinect sensor, built a connector cable, and hooked it up to my computer. Using the OpenNI framework with its C# wrapper and the PrimeSense NITE drivers, I got easy access to the skeleton model.

      This lets me control the Proxemic Pong game which I built for the Vicons. The tracking is very accurate and works reliably with two people. Three people are possible, but occlusion becomes a problem, of course. It even keeps the assigned IDs if people leave the visible area for a short time. For the Proxemic Pong game I get all the same features as with the Vicons except editing the paddle - which is due to the lack of interactive vertical surfaces ;-).

      The code is all in the SVN for the Proxemic Pong game. It's mainly this class: SimpleKinectController.cs
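      The core idea - mapping a tracked skeleton joint onto a Pong paddle - can be sketched as follows. This is an illustrative reconstruction, not the actual SimpleKinectController.cs code; the class and parameter names are assumptions, and OpenNI reports joint positions in millimeters, which this sketch takes as given.

```csharp
using System;

//Hypothetical sketch: mapping a skeleton joint position (e.g. a hand joint
//from the OpenNI skeleton model) to a paddle coordinate for Proxemic Pong.
public static class PaddleMapper
{
    //Maps a hand height handY (in mm) inside the tracked range [minY, maxY]
    //to a paddle position within [0, screenHeight], clamped to the screen.
    public static double MapHandToPaddle(double handY, double minY, double maxY, double screenHeight)
    {
        double t = (handY - minY) / (maxY - minY);   //normalize to 0..1
        t = Math.Max(0.0, Math.Min(1.0, t));         //clamp to the screen bounds
        return t * screenHeight;
    }
}
```

      In each frame one would read a hand joint from the skeleton and feed its vertical coordinate into this mapping to position the paddle.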

      The embedded video has poor quality, but you can download one with higher quality: Proxemic Pong with Kinect video


      December 01, 2010

      Picture Canvas

      The Picture Canvas knows about the following artifacts in space:
      • Digital Camera: A digital camera that can capture and store pictures. In my example, the camera also has integrated Wi-Fi (through an Eye-Fi card) and is connected to all other digital devices in the homespace, so I can transfer images between them.
      • Vertical Surface: A large vertical surface is used for displaying pictures and lets people interact using touch.
      • Digital Picture Frame: A digital picture frame is also connected to all other devices (via USB to the main machine). It simply displays pictures.
      Scenario:
      • Approach: When a person enters the room with his digital camera, a subtle icon of the camera follows him along the border of the screen, indicating that it is connected. While he moves closer, the icon reveals the latest pictures taken with the camera as a stack below the icon. When he stands close to the screen, the pictures arrange like a fan around the camera.
      • Picture Browsing: By tilting the camera to the left or right, you can navigate back and forth through the pictures. When standing close, the fan rotates; when standing further away, the stack reorders itself accordingly.
      • Transferring Pictures: The application offers several ways of transferring pictures from the camera to the screen, depending on distance and orientation. When standing close to the screen, one can use touch and drag images out of the fan to copy them onto the screen canvas. Touching the display with the camera drops the front image directly onto the canvas. While standing in front of the screen but too far away to reach it, the stack of images is also projected onto the center of the screen, depending on the exact position of the camera. Now a person can perform a short acceleration towards the screen to pin the front image to the canvas; this is comparable to a throw gesture. It is also possible to transfer images to the digital picture frame in several ways. First, the person can simply touch the picture frame with the camera to transfer the latest image to it, and it will instantly be displayed in the frame. Again, the throw gesture transfers an image from a distance. To distinguish between a throw gesture towards the large display and one towards the picture frame, we determine the attention towards an object by considering the positions of the person's head and the camera. The currently attended screen has a light border around it. On the border of the large vertical surface we display an icon of the picture frame that is oriented towards the location of the picture frame in the room. It acts as a "tunnel": you can drag images from the screen canvas onto the icon and they will be displayed on the picture frame.
      • Ambient Visualization: Once pictures are transferred onto the display's canvas, people can drag them around and change their visual order (z-index). When a person sits down on the couch in front of the display, a slide show plays the images according to their visual order on the screen and restores their original positions on the canvas when the person moves towards the screen again.
      • Shooting Pictures: Note that it is possible to take pictures with the camera at any time, and they will instantly appear in the current visualization.
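      The attention heuristic in the transfer scenario - deciding which display a throw gesture targets from the person's head and camera positions - could be sketched like this. The vector type and function names are hypothetical stand-ins, not the actual implementation; the assumption is that the attended display is the one closest to the ray from the head through the camera.

```csharp
using System;

//Minimal 3D vector for the sketch (stand-in for the toolkit's vector type).
public struct Vec3
{
    public double X, Y, Z;
    public Vec3(double x, double y, double z) { X = x; Y = y; Z = z; }
    public static Vec3 operator -(Vec3 a, Vec3 b) { return new Vec3(a.X - b.X, a.Y - b.Y, a.Z - b.Z); }
    public double Dot(Vec3 o) { return X * o.X + Y * o.Y + Z * o.Z; }
    public double Length() { return Math.Sqrt(Dot(this)); }
}

public static class Attention
{
    //Returns the index of the display whose center lies closest to the ray
    //from the head through the camera (largest cosine, i.e. smallest angle).
    public static int AttendedDisplay(Vec3 head, Vec3 camera, Vec3[] displayCenters)
    {
        Vec3 gaze = camera - head;
        int best = -1;
        double bestCos = -2.0;
        for (int i = 0; i < displayCenters.Length; i++)
        {
            Vec3 toDisplay = displayCenters[i] - head;
            double cos = gaze.Dot(toDisplay) / (gaze.Length() * toDisplay.Length());
            if (cos > bestCos) { bestCos = cos; best = i; }
        }
        return best;
    }
}
```

      A throw towards the attended display would then route the front image either to the large surface or to the picture frame.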
      Implementation challenges:
      • Transfer of images: The camera loads all images into a specified folder on the main machine that runs the large display, so the application itself only has to watch that folder.
      • Fan visualization: The dynamic visualization of a fan around an icon that animates along the edges was quite a challenge. We tried several approaches using circles with the images distributed along their paths, but ran into issues aligning the images to the border. So we now create an arc segment after placing the largest and the smallest image on the border of the screen, and then distribute the images in between.
      • This figure illustrates how the items in the fan are arranged along a path segment between the largest and smallest item. Here it is placed on the left border of the display, which was the tricky part.
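      The arc-based placement described above can be sketched as a small layout function. This is an illustrative reconstruction under assumptions: the endpoints of the arc (angles and radii) are taken as already determined from the largest and smallest item on the border, and the remaining items are interpolated between them.

```csharp
using System;

//Sketch of the fan layout: items are placed on an arc segment around the
//icon's center, interpolating both angle and radius from the first (largest)
//to the last (smallest) item.
public static class FanLayout
{
    //Returns (x, y) centers for n items on an arc from startAngle to endAngle
    //(radians) around (cx, cy), with radii interpolated from rStart to rEnd.
    public static double[][] Arrange(int n, double cx, double cy,
        double startAngle, double endAngle, double rStart, double rEnd)
    {
        double[][] positions = new double[n][];
        for (int i = 0; i < n; i++)
        {
            double t = n == 1 ? 0.0 : (double)i / (n - 1);   //0..1 along the fan
            double angle = startAngle + t * (endAngle - startAngle);
            double r = rStart + t * (rEnd - rStart);
            positions[i] = new double[] { cx + r * Math.Cos(angle), cy + r * Math.Sin(angle) };
        }
        return positions;
    }
}
```

      With the two endpoint items pinned to the display border, this interpolation keeps the whole fan on a smooth arc instead of forcing each image onto a full circle.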

      September 27, 2010

      Follow Canvas - Module

      I have finished a "Follow-Canvas" module using the new Proximity Toolkit. It provides a canvas that aligns itself on a display so that it minimizes the distance between a specified followed presence and the FollowCanvas. The main features are:
      • Choose between movement on the border of the display or also allow projection to any position on the display
      • Choose between two strategies for calculating the position. One takes the line between the display's center and the FollowedPresence's center, projects it into the display plane, and places the canvas at the intersection with the display border. The other projects the FollowedPresence's center onto the display plane and places the canvas at the closest point on the display's border
      • Set a movement threshold, so that the canvas only moves when the position change of the presence exceeds it
      • Choose to have animated movement
      • Resize the Canvas by setting its Width and Height, while it automatically repositions.
      • Choose whether to position the canvas center at the border or keep the complete canvas inside the display bounds
      Here is a simple example of how to let a black box follow a tracked pencil:

      DeviceFollowCanvas followCanvas = new DeviceFollowCanvas(displayCanvas, "SmartBoard", "pencil");
      followCanvas.Width = 100;
      followCanvas.Height = 100;
      followCanvas.Background = Brushes.Black;
                                      
      //additional Settings
      followCanvas.PositioningBehaviour = DeviceFollowCanvas.ScreenAlignment.Inside;
      followCanvas.AnimatedMovement = true;
      followCanvas.AnimatedScaling = false;
      followCanvas.MovementAnimationSpeed = .3;
      followCanvas.ScalingAnimationSpeed = .3;
      followCanvas.AllowProjection = false;
      followCanvas.PositionCalculationStrategie = DeviceFollowCanvas.CalculationStrategie.AngleFromCenter;
      followCanvas.MovementThreshold = 0.1;
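      The second positioning strategy - projecting the presence onto the display plane and snapping to the nearest border point - can be sketched in 2D display coordinates. This is not the toolkit's actual code; it assumes the presence has already been projected into the display plane, and the names are illustrative.

```csharp
using System;

//Sketch of the "closest point on the display border" strategy: clamp the
//projected presence position into the display rectangle, then push the
//clamped point to the nearest edge.
public static class BorderProjection
{
    //presenceX/presenceY: projected presence position in display coordinates;
    //width/height: display size. Returns {x, y} of the closest border point.
    public static double[] ClosestBorderPoint(double presenceX, double presenceY, double width, double height)
    {
        //clamp into the display rectangle first
        double x = Math.Max(0, Math.Min(width, presenceX));
        double y = Math.Max(0, Math.Min(height, presenceY));
        //distance of the clamped point to each edge
        double dLeft = x, dRight = width - x, dTop = y, dBottom = height - y;
        double min = Math.Min(Math.Min(dLeft, dRight), Math.Min(dTop, dBottom));
        if (min == dLeft) return new double[] { 0, y };
        if (min == dRight) return new double[] { width, y };
        if (min == dTop) return new double[] { x, 0 };
        return new double[] { x, height };
    }
}
```

      The MovementThreshold from the example above would then suppress updates whenever the new border point is closer to the old one than the threshold.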

      September 19, 2010

      Region Observer - Module

      I created a module that lets you define region shapes inside the homespace and observes artifacts moving in and out of them.
      • The Regions can be either a box, a box with a hole, a cylinder or a ring
      • An event handler fires whenever a Presence moves into a new Region
      • You can have overlapping regions and set Priorities
      • A dynamic border value can be set; it is applied to a shape and prevents quick region changes when someone stands between two regions
      • You can also be notified whenever an artifact is no longer visible to the tracking system

      Here is an example that uses the RegionsDetector to observe two people in the homespace and writes their current region to the screen:

      public RegionTeller(){
          InitializeComponent();

          //This shows how you can define custom regions and observe two people moving in and out of them
          RegionsDetector peopleObserver = new RegionsDetector();

          //add two hats that will be observed as people
          peopleObserver.AddObservedPresence("WhiteHat");
          peopleObserver.AddObservedPresence("BlackHat");
                     
          //define and Add custom regions
          ProximityToolkit.Vector3 displayCenter = new ProximityToolkit.Vector3(-350,0,-89);

          //Touch region is a rectangular box around the display
          RectangularRegion touchRegion = new RectangularRegion("touch", 400, 700, 3000);
          touchRegion.Center = displayCenter;
          touchRegion.DynamicRegionBelt = 150;
          peopleObserver.AddRegion(touchRegion,2);           

          //Sitting Region is a Rectangular Box around the couch
          RectangularRegion sittingRegion = new RectangularRegion("sitting", 600, 1500, 1350);
          sittingRegion.Center = new ProximityToolkit.Vector3(1500, 0, 0);
          sittingRegion.DynamicRegionBelt = 150;
          peopleObserver.AddRegion(sittingRegion, 2);

          //Intermediate interaction region is a cylindrical region around the display, with lower priority than the touch and sitting regions
          CircularRegion intermediateRegion = new CircularRegion("intermediate", 2000);
          intermediateRegion.Center = displayCenter;
          intermediateRegion.DynamicRegionBelt = 150;
          peopleObserver.AddRegion(intermediateRegion, 1);

          //Far-away region is a circular belt around the intermediate interaction region, with lower priority than the touch and sitting regions
          CircularRegion farRegion = new CircularRegion("far away", new RegionValue(2000, 4000));
          farRegion.Center = displayCenter;
          farRegion.DynamicRegionBelt = 150;
          peopleObserver.AddRegion(farRegion, 1);
                     
          //Be notified whenever a person changes between regions
          peopleObserver.OnRegionChange += new RegionChangeHandler(peopleObserver_OnRegionChange);
      }

      //Event handler that writes the presence's current region to a label.
      //(The parameter type is assumed to be the event-args type delivered by RegionChangeHandler.)
      void peopleObserver_OnRegionChange(RegionChangeEventArgs presence)
      {
          RegionLabel.Dispatcher.Invoke(System.Windows.Threading.DispatcherPriority.Normal, new Action(
              delegate()
              {
                  RegionLabel.Content = presence.Presence.Name + ": " + presence.Region.Name;
              }
              ));
      }
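      The DynamicRegionBelt used above can be understood as hysteresis at the region boundary. The following sketch shows the idea for a circular region; it is an assumption about how the belt behaves (enter at the radius, leave only past radius plus belt), not the module's actual code.

```csharp
using System;

//Sketch of the dynamic-belt idea: once a presence is assigned to a circular
//region, it only leaves after crossing the boundary by more than the belt
//width, preventing rapid flip-flopping when standing right at the border.
public class CircularRegionSketch
{
    readonly double radius, belt;
    bool inside;

    public CircularRegionSketch(double radius, double belt)
    {
        this.radius = radius;
        this.belt = belt;
    }

    //Update with the presence's distance from the region center; returns
    //whether the presence currently counts as inside the region.
    public bool Update(double distance)
    {
        if (inside)
            inside = distance <= radius + belt;   //leave only past radius + belt
        else
            inside = distance <= radius;          //enter at the plain radius
        return inside;
    }
}
```

      With radius 2000 and belt 150 as in the example, a person at distance 2100 stays in the intermediate region if they were already inside it, but is not newly assigned to it.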

      September 14, 2010

      Synchronized Canvas and Synchronized Media Repository

      Nic and I talked about the idea of implementing a module that
      - first offers a shared synchronized repository for any media that can be shared between computers (for example the large screen and a tablet)
      - and on top provides a shared canvas that could either be duplicated between two connected screens or extend from one screen to the other. This could, for example, enable easy transfer of a screen object from a tablet to a large screen, or let someone control the complete large screen with her tablet.
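      Since this is only an idea so far, here is a purely illustrative sketch of what a synchronization message for the shared canvas might look like: one canvas object's state, serialized to a string that could be sent between the connected screens (e.g. over a socket). All names and the wire format are hypothetical.

```csharp
using System;
using System.Globalization;

//Hypothetical message describing one canvas object's state on the shared
//canvas; Serialize/Parse give a trivial round-trippable wire format.
public class CanvasItemUpdate
{
    public string Id;
    public double X, Y, Z; //position and z-index on the shared canvas

    public string Serialize()
    {
        return string.Join(";", new string[] {
            Id,
            X.ToString(CultureInfo.InvariantCulture),
            Y.ToString(CultureInfo.InvariantCulture),
            Z.ToString(CultureInfo.InvariantCulture) });
    }

    public static CanvasItemUpdate Parse(string message)
    {
        string[] parts = message.Split(';');
        return new CanvasItemUpdate {
            Id = parts[0],
            X = double.Parse(parts[1], CultureInfo.InvariantCulture),
            Y = double.Parse(parts[2], CultureInfo.InvariantCulture),
            Z = double.Parse(parts[3], CultureInfo.InvariantCulture) };
    }
}
```

      Whether the canvas is duplicated or extended across screens would then mainly change how the receiving side maps the incoming coordinates.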