Many features have been proposed for encoding the input signal from digital pens and touch-based interaction. They are widely used for analyzing and classifying handwritten text, sketches, and gestures. Although they are well defined mathematically, many features are non-trivial and therefore difficult for humans to understand. In this paper, we present an application that visualizes a subset of 114 digital pen features in real time while drawing. It provides an easy-to-use interface that allows application developers and machine learning practitioners to learn how digital pen features encode their inputs, helps in the feature selection process, and enables rapid prototyping of sketch and gesture classifiers.
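To make the notion of a digital pen feature concrete, the following minimal sketch computes a few elementary stroke descriptors (path length, duration, average speed, bounding-box diagonal) from timestamped (x, y) samples. These particular descriptors and the function name are illustrative choices made here; they are not the feature set described in the paper.

```python
import math

def basic_pen_features(points):
    """Compute a few simple descriptors from a stroke given as (x, y, t) samples.

    Illustrative example only; not the paper's 114-feature set.
    """
    xs = [p[0] for p in points]
    ys = [p[1] for p in points]
    ts = [p[2] for p in points]

    # Total path length: sum of distances between consecutive samples.
    path_length = sum(
        math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
        for i in range(len(points) - 1)
    )

    # Stroke duration in the same unit as the timestamps.
    duration = ts[-1] - ts[0]

    # Average speed, guarding against a zero-duration stroke.
    avg_speed = path_length / duration if duration > 0 else 0.0

    # Diagonal of the bounding box, a common size descriptor.
    bbox_diagonal = math.hypot(max(xs) - min(xs), max(ys) - min(ys))

    return {
        "path_length": path_length,
        "duration": duration,
        "avg_speed": avg_speed,
        "bbox_diagonal": bbox_diagonal,
    }

# Example: a short diagonal stroke sampled at four points.
stroke = [(0.0, 0.0, 0.00), (1.0, 1.0, 0.01), (2.0, 2.0, 0.02), (3.0, 3.0, 0.03)]
print(basic_pen_features(stroke))
```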