The discipline of computer vision strives to extract meaning from the patterns of light falling across a sensor. Many difficulties hinder progress toward this goal, ranging from ambiguity about which representation best captures the structure of the information, to the prohibitive computational cost of processing sensor images at sufficiently high resolution. One promising avenue for overcoming such difficulties is to learn from the single system that has already solved many of the inherent problems: biological vision. I develop a system of neurons, inspired by a subset of the visual areas of the primate brain, to direct the "gaze" of a pan-tilt camera in real time toward salient regions demarcated by one or more low-level visual features. The ultimate goal of this work is a system that can visually navigate a scene in a manner and time frame similar to those of humans.
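As a point of orientation, the core loop described above can be sketched in a few lines: compute a saliency map from a low-level feature and steer the camera toward its peak. This is a minimal illustrative sketch, not the model developed in this work; the center-surround intensity contrast, the blur scales, the linear pixel-to-angle mapping, and the field-of-view values are all assumptions chosen for brevity.

```python
import numpy as np

def box_blur(img, k):
    """Separable box filter of odd width k (a stand-in for Gaussian smoothing)."""
    pad = k // 2
    padded = np.pad(img, pad, mode="edge")
    ker = np.ones(k) / k
    out = np.apply_along_axis(lambda r: np.convolve(r, ker, mode="valid"), 1, padded)
    out = np.apply_along_axis(lambda c: np.convolve(c, ker, mode="valid"), 0, out)
    return out

def saliency(img, fine=3, coarse=15):
    """Center-surround contrast on one low-level feature (intensity):
    the difference between a fine-scale and a coarse-scale response."""
    return np.abs(box_blur(img, fine) - box_blur(img, coarse))

def next_gaze(img, fov_deg=(60.0, 45.0)):
    """Pick the most salient pixel and map it to pan/tilt offsets in degrees,
    assuming a linear mapping across a hypothetical camera field of view."""
    s = saliency(img)
    y, x = np.unravel_index(np.argmax(s), s.shape)
    h, w = img.shape
    pan = (x / (w - 1) - 0.5) * fov_deg[0]   # positive pan = target right of center
    tilt = (0.5 - y / (h - 1)) * fov_deg[1]  # positive tilt = target above center
    return pan, tilt

# Usage: a bright spot in the upper-right quadrant should yield positive pan and tilt.
frame = np.zeros((64, 64))
frame[16, 48] = 1.0
pan, tilt = next_gaze(frame)
print(pan, tilt)
```

In the biologically inspired system itself, the single intensity feature would be replaced by a bank of feature maps and a neural winner-take-all stage rather than a bare argmax, but the overall sense-decide-move cycle is the same.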