On event-based optical flow detection.
View Article:
PubMed Central - PubMed
Affiliation: Faculty of Engineering and Computer Science, Institute of Neural Information Processing, Ulm University, Ulm, Germany.
ABSTRACT
Event-based sensing, i.e., the asynchronous detection of luminance changes, promises low-energy, high dynamic range, and sparse sensing. This stands in contrast to whole-image frame-wise acquisition by standard cameras. Here, we systematically investigate the implications of event-based sensing in the context of visual motion, or flow, estimation. Starting from a common theoretical foundation, we discuss different principal approaches for optical flow detection, ranging from gradient-based methods over plane-fitting to filter-based methods, and identify strengths and weaknesses of each class. Gradient-based methods for local motion integration are shown to suffer from the sparse encoding in address-event representations (AER). Approaches exploiting the local plane-like structure of the event cloud, on the other hand, are shown to be well suited. Within this class, filter-based approaches are shown to define a proper detection scheme which can also deal with the problem of representing multiple motions at a single location (motion transparency). A novel biologically inspired efficient motion detector is proposed, analyzed and experimentally validated. Furthermore, a stage of surround normalization is incorporated. Together with the filtering this defines a canonical circuit for motion feature detection. The theoretical analysis shows that such an integrated circuit reduces motion ambiguity in addition to decorrelating the representation of motion-related activations.
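The plane-fitting idea mentioned above can be illustrated with a short sketch (an assumption for illustration, not code from the paper): events produced by a translating edge lie approximately on a plane t = a·x + b·y + c in (x, y, t) space, and the flow can be recovered from the fitted plane parameters. The synthetic setup, variable names, and noise model below are all hypothetical.

```python
import numpy as np

rng = np.random.default_rng(0)
v_true = np.array([2.0, 1.0])  # assumed true edge velocity in px/s

# For a purely translating edge the time surface satisfies
# grad t = v / |v|^2, so the plane slopes are (a, b) = v / |v|^2.
a_true, b_true = v_true / np.dot(v_true, v_true)

# Synthetic event cloud: spatial positions with timestamps on that plane,
# plus a little timing jitter as a real event camera would produce.
xy = rng.uniform(0.0, 10.0, size=(200, 2))
t = a_true * xy[:, 0] + b_true * xy[:, 1] + 0.5 + rng.normal(0.0, 1e-3, 200)

# Least-squares plane fit t ~ a*x + b*y + c over the local event cloud.
A = np.column_stack([xy, np.ones(len(xy))])
(a, b, c), *_ = np.linalg.lstsq(A, t, rcond=None)

# Invert the relation to read the flow off the plane: v = (a, b) / (a^2 + b^2).
v_est = np.array([a, b]) / (a**2 + b**2)
print(v_est)  # close to v_true
```

In practice the fit is done over a small spatio-temporal neighborhood of each event, often with robust outlier rejection, since only locally does the event cloud resemble a single plane.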
Mentions: When this gray-level transition moves through the origin at time t = 0 it generates a slanted line with normal n in the x–t-space (c.f. Figure 3). The speed s of the moving contrast edge is given by s = sin(θ)/cos(θ) = tan(θ), where θ is the angle between n and the x-axis (this is identical to the angle between the edge tangent and the t-axis). For a stationary gray-level edge (zero speed) we get θ = 0 (i.e., the edge generated by the DL transition in the x–t-domain is located on the t-axis). Positive angles θ ∈ (0°, 90°) (measured in counterclockwise direction) define leftward motion, while negative angles define rightward motion. For illustrative purposes, we consider a DL contrast that is moving to the right (c.f. Figure 3). The spatio-temporal gradient is maximal along the normal direction n = (cos θ, sin θ)^T. The function g(x; t) describing the resulting space-time picture of the movement in the x–t-space is thus given as

\[ g_\sigma^\theta(x;t) = c \int_{-\infty}^{x_\perp} \frac{1}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{\xi^2}{2\sigma^2}\right) d\xi, \]

with x_⊥ = x · cos θ − t · sin θ. The respective partial temporal and spatial derivatives are given as

\[ \frac{\partial}{\partial t} g_\sigma^\theta(x;t) = -\frac{c}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{x_\perp^2}{2\sigma^2}\right) \cdot \sin\theta, \tag{5} \]

\[ \frac{\partial}{\partial x} g_\sigma^\theta(x;t) = \frac{c}{\sqrt{2\pi}\,\sigma} \exp\!\left(-\frac{x_\perp^2}{2\sigma^2}\right) \cdot \cos\theta. \tag{6} \]
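The relations above can be checked numerically. The following sketch (parameter values c, σ, θ and the sample point are assumptions for illustration) evaluates the smoothed edge profile g_σ^θ(x; t) as a cumulative Gaussian in the rotated coordinate x_⊥ = x·cos θ − t·sin θ, and confirms by central finite differences that the partials match Eqs. (5)–(6) and that −g_t/g_x = tan θ recovers the edge speed s.

```python
import math

c, sigma, theta = 1.0, 0.8, math.radians(30.0)  # assumed example values

def g(x, t):
    """Smoothed edge profile: cumulative Gaussian of the rotated coordinate."""
    x_perp = x * math.cos(theta) - t * math.sin(theta)
    return c * 0.5 * (1.0 + math.erf(x_perp / (math.sqrt(2.0) * sigma)))

def gauss(x_perp):
    """Gaussian factor c / (sqrt(2*pi)*sigma) * exp(-x_perp^2 / (2*sigma^2))."""
    return c / (math.sqrt(2.0 * math.pi) * sigma) * math.exp(-x_perp**2 / (2.0 * sigma**2))

x, t, h = 0.3, 0.1, 1e-6
x_perp = x * math.cos(theta) - t * math.sin(theta)

g_t = (g(x, t + h) - g(x, t - h)) / (2.0 * h)  # numeric d/dt
g_x = (g(x + h, t) - g(x - h, t)) / (2.0 * h)  # numeric d/dx

err_t = abs(g_t - (-gauss(x_perp) * math.sin(theta)))  # should match Eq. (5)
err_x = abs(g_x - ( gauss(x_perp) * math.cos(theta)))  # should match Eq. (6)
speed = -g_t / g_x                                      # should equal tan(theta)
print(err_t, err_x, speed, math.tan(theta))
```

Taking the ratio −g_t/g_x is exactly the one-dimensional gradient-based flow estimate; the Gaussian factors cancel, leaving tan θ, i.e., the speed s described in the text.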