Please use this identifier to cite or link to this item: http://hdl.handle.net/1946/16870
In this thesis, we explore how objects affect the space around them. We show that spatial information is extracted even from completely novel objects. Information derived from the shape of objects is swiftly and automatically integrated into a variety of processes, such as the allocation of visual attention, the programming of eye movements, and the perception of motion. We provide evidence that the lateral intraparietal area (LIP) of the macaque is able to extract such spatial information from objects. We also show that IPS1, the putative human homologue of LIP, can represent space not just in pure retinotopic coordinates but can code for space relative to the location of an object.