In this project, we demonstrate a novel system that tracks a user's facial motions using two wide-angle cameras mounted discreetly at the corners of eyeglasses.
Contextual sensing with wearable cameras has produced a variety of proposed camera angles, each capturing a different kind of visual scene. In this paper, we propose a new camera view that aims to capture, from a single viewpoint, the same visual information as many of these camera positions and orientations combined. The camera, mounted on the corner of a glasses frame, points downward toward the floor, covering a field of view we name the Personal Activity Radius (PAR).
We propose a system for tracking horse strides and jumps using a smartphone attached to the horse's saddle. Our system detects and segments individual strides and computes stride length using signal processing and machine learning methods.
In modern showjumping, the success of a horse-rider pair is measured by its ability to finish a given course of obstacles without penalties. This paper proposes a solution for tracking gaits and jumps using a smartphone attached to the horse's saddle. We propose an event detection algorithm based on the Discrete Wavelet Transform and peak detection to identify jumps and canter strides between fences. We segment the signal into gait and jump sections, compute statistical and heuristic features, and classify the segments using different machine learning algorithms.
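The detection stage described above (a Discrete Wavelet Transform followed by peak detection on the detail coefficients) could be sketched roughly as follows. This is an illustrative NumPy-only sketch, not the paper's implementation: it uses a Haar wavelet cascade and a simple greedy peak picker as stand-ins, and the decomposition level, threshold, sampling rate, and minimum stride separation are assumed values.

```python
import numpy as np

def detect_events(accel, levels=3, fs=100.0, min_separation_s=0.3):
    """Detect candidate stride/jump events in a vertical-acceleration
    trace: cascade Haar DWT levels, threshold the coarsest detail
    magnitudes, and greedily pick well-separated peaks.

    All parameters here are illustrative assumptions, not values
    taken from the paper.
    """
    approx = np.asarray(accel, dtype=float)
    for _ in range(levels):
        n = len(approx) - len(approx) % 2            # even-length trim
        detail = (approx[0:n:2] - approx[1:n:2]) / np.sqrt(2.0)
        approx = (approx[0:n:2] + approx[1:n:2]) / np.sqrt(2.0)
    mag = np.abs(detail)                             # transient energy
    threshold = mag.mean() + 2.0 * mag.std()         # adaptive cutoff
    # Each coarse-level coefficient covers 2**levels input samples,
    # so convert the minimum stride separation to coefficient units.
    stride = 2 ** levels
    min_gap = max(1, int(min_separation_s * fs) // stride)
    peaks = []
    for i in np.argsort(mag)[::-1]:                  # strongest first
        if mag[i] < threshold:
            break
        if all(abs(i - j) >= min_gap for j in peaks):
            peaks.append(i)
    # Map coefficient indices back to input-sample positions.
    return np.sort(np.array(peaks)) * stride
```

In a full pipeline, the windows around each detected event would then be segmented, summarized with statistical and heuristic features, and passed to a classifier to separate canter strides from jumps.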