This paper presents a new Electronic Travel Aid (ETA), the 'Acoustic Prototype', designed to assist the navigation of visually impaired users. The device consists of a set of 3-D Complementary Metal-Oxide-Semiconductor (3-D CMOS) image sensors, based on three-dimensional integration and CMOS processing techniques, embedded in a pair of glasses; stereo headphones; and a Field-Programmable Gate Array (FPGA) used as the processing unit. The device is intended as a complementary aid for navigation in both known and unknown open environments. The FPGA and the 3-D CMOS image sensor electronics control object detection. Distance measurement is achieved with chip-integrated technology based on the Multiple Short Time Integration method. The processed object-distance information is presented to the user as sounds through the stereophonic headphones, which the user interprets as an acoustic image of the surrounding environment. The Acoustic Prototype thus transforms the surfaces of objects in the real environment into sounds, in a manner similar to a bat's acoustic orientation. Users with good hearing ability are able, after a few weeks of training, to perceive not only the presence of an object but also its form: whether it is round or has corners, whether it is a car or a box, whether it is made of cardboard, iron, or cement, whether it is a tree or a person, and whether it is static or moving. The information is delivered continuously, within a few nanoseconds, until the device is shut down, helping the end user to perceive the environment in real time.
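The core idea of such a sonification stage can be illustrated with a minimal sketch: a measured object distance and horizontal position are mapped to the frequency, loudness, and stereo balance of a tone. All function names, parameter ranges, and the specific mapping below are illustrative assumptions, not the paper's actual implementation.

```python
import math

def distance_to_tone(distance_m, azimuth_deg, max_distance_m=5.0):
    """Hypothetical distance-to-sound mapping (not the paper's method).

    Closer objects yield louder, higher-pitched tones; the azimuth sets
    the left/right stereo balance. Returns (frequency_hz, left, right).
    """
    # Clamp the distance to the assumed sensing range.
    d = min(max(distance_m, 0.0), max_distance_m)
    proximity = 1.0 - d / max_distance_m        # 1.0 = very close, 0.0 = out of range
    freq_hz = 200.0 + 1800.0 * proximity        # nearer object -> higher pitch
    loudness = proximity                        # nearer object -> louder tone
    # Constant-power stereo pan: -90 deg = fully left, +90 deg = fully right.
    pan = math.radians((azimuth_deg + 90.0) / 2.0)
    left = loudness * math.cos(pan)
    right = loudness * math.sin(pan)
    return freq_hz, left, right

# Example: an object straight ahead at 1 m is loud, high-pitched, and centered.
freq, left, right = distance_to_tone(1.0, 0.0)
```

Run continuously over each depth frame, a mapping of this kind would produce the evolving "acoustic image" the abstract describes, with one tone per detected object region.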