In this paper we present a novel approach for generating viewpoint-invariant features from single images and demonstrate their application to robust matching over widely separated views. The key idea is to recover building structure from a single image and then utilise the recovered 3D geometry to improve the performance of feature extraction and matching. Urban environments usually exhibit strong structural regularities, so images of such environments contain straight parallel lines and vanishing points that can be efficiently exploited for 3D reconstruction. We present an effective scheme to recover 3D planar surfaces using the extracted line segments and their associated vanishing points. The viewpoint-invariant features are then computed on the normalized fronto-parallel views of the recovered 3D planes. The advantages of the proposed approach are: (1) the new feature is highly robust to perspective distortion and viewpoint change because it accounts for 3D geometry; (2) the features are computed entirely from single images and require no additional devices (e.g. stereo cameras or active ranging sensors). Experiments demonstrate the ability of the proposed scheme to handle very difficult wide-baseline matching tasks in the presence of repetitive building structures and significant viewpoint changes. © 2009 IEEE.
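The rectification step described above can be illustrated with a minimal sketch under assumed inputs (the vanishing-point coordinates below are hypothetical, not from the paper): the vanishing line of a plane is the cross product of its two vanishing points, and the homography that maps this line to the line at infinity removes the projective component of the distortion. Full metric fronto-parallel rectification would additionally use the orthogonality of the two vanishing directions; only the first, affine-rectification stage is sketched here.

```python
# Sketch: affine rectification of a plane from two vanishing points.
# All quantities are in homogeneous coordinates; the specific numbers
# are hypothetical and serve only to illustrate the construction.

def cross(a, b):
    """Cross product of two homogeneous 3-vectors."""
    return (a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0])

def apply_h(H, p):
    """Apply a 3x3 homography H to a homogeneous point p."""
    return tuple(sum(H[i][j] * p[j] for j in range(3)) for i in range(3))

# Two vanishing points of a building facade (hypothetical image coordinates).
v1 = (100.0, 50.0, 1.0)
v2 = (-80.0, 40.0, 1.0)

# The vanishing line of the facade plane passes through both vanishing points.
l1, l2, l3 = cross(v1, v2)

# Homography mapping the vanishing line (l1, l2, l3) to the line at
# infinity (0, 0, 1); this removes the projective distortion of the plane.
H = [[1.0, 0.0, 0.0],
     [0.0, 1.0, 0.0],
     [l1 / l3, l2 / l3, 1.0]]

# After rectification both vanishing points lie at infinity
# (third homogeneous coordinate becomes zero).
w1 = apply_h(H, v1)[2]
w2 = apply_h(H, v2)[2]
```

Because each vanishing point lies on the vanishing line, the third row of H evaluates to zero on both, sending them to infinity; the rectified image then shows the plane's parallel line families as truly parallel.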