Attention Modelling
In computer vision, considerable effort has gone into modelling human attention, especially the bottom-up attentional mechanism.
Generally speaking, there are two kinds of models that mimic the bottom-up saliency mechanism. The first is based on spatial contrast analysis: for example, one influential model, inspired by the putative neural mechanism, uses a center-surround mechanism to define saliency across scales. It has also been hypothesized that some visual inputs are intrinsically salient in certain background contexts, and that this salience is task-independent. This model has established itself as the exemplar for saliency detection and is consistently used for comparison in the literature. The second kind is based on frequency-domain analysis. The first such method, proposed by Hou et al., is known as the spectral residual (SR) method; the PQFT method was introduced subsequently. SR derives saliency from the residual of the log-amplitude spectrum, while PQFT relies only on the phase information. In 2012, the HFT method was introduced, which makes use of both the amplitude and the phase information.
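To make the frequency-domain approach concrete, the spectral residual idea can be sketched as follows: take the 2-D Fourier transform of an image, subtract a locally averaged log-amplitude spectrum from the actual one (the "residual"), and reconstruct with the original phase; regions whose spectrum deviates from the average stand out as salient. The sketch below is a minimal, illustrative implementation in NumPy, not the authors' original code; the function name, the 3x3 averaging kernel, and the final normalisation step are all assumptions made for this example.

```python
import numpy as np

def spectral_residual_saliency(image, avg_kernel=3):
    """Illustrative sketch of spectral-residual saliency (after Hou et al.).

    `avg_kernel` (box-filter size) is an assumed parameter choice,
    not a value prescribed by the original method.
    """
    # 2-D FFT of the grayscale image: amplitude and phase spectra
    f = np.fft.fft2(image.astype(np.float64))
    log_amplitude = np.log1p(np.abs(f))  # log1p avoids log(0)
    phase = np.angle(f)

    # Local average of the log-amplitude spectrum via a simple box filter
    k = avg_kernel
    pad = k // 2
    padded = np.pad(log_amplitude, pad, mode="edge")
    avg = np.zeros_like(log_amplitude)
    for dy in range(k):
        for dx in range(k):
            avg += padded[dy:dy + log_amplitude.shape[0],
                          dx:dx + log_amplitude.shape[1]]
    avg /= k * k

    # Spectral residual: what stands out against the averaged spectrum
    residual = log_amplitude - avg

    # Reconstruct in the spatial domain with the original phase;
    # the squared magnitude serves as the saliency map
    saliency = np.abs(np.fft.ifft2(np.exp(residual + 1j * phase))) ** 2

    # Normalise to [0, 1] for display (an added convenience step)
    saliency -= saliency.min()
    saliency /= saliency.max() + 1e-12
    return saliency
```

The same pipeline with the residual dropped entirely (keeping only the phase) corresponds to the phase-only variants such as PQFT, while HFT instead processes the amplitude spectrum at multiple scales.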