Visual perception is an important function in a mobile robot, facilitating navigation, obstacle avoidance and directed motion towards or away from visual stimuli. The concepts of active vision, visual attention and gaze control can speed up image processing for mobile robot navigation and scene analysis. This paper discusses aspects of active vision through hardware and software implementations on the Bunny Robot, and explores implementation issues of Simultaneous Localisation and Mapping (SLAM) and eye vergence control in a humanoid robot with a low power budget.

The Bunny Robot has been developed as a platform for teaching robotics. It is a bipedal humanoid design that weighs 2.25 kg including batteries and stands 60 cm high including its ears. It is based on the Bioloid humanoid servo skeleton from Robotis with Dynamixel AX-12+ servos, but additionally has a central card cage that can hold five low-power ARM-based processor circuit boards. The skull holds twin cameras, servos for head, eye and ear control, a two-axis gyro, and an FPGA for servo PWM, local processing and video-stream pre-processing. The entire design has been modelled in SolidWorks, supporting centre-of-gravity calculations and animation to check mechanical function. The additional mechanical parts have all been rapid prototyped for ease of modification during development.
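The paper does not detail its vergence controller, but the geometry behind symmetric eye vergence is standard: to fixate a target directly ahead at distance d with an inter-camera baseline b, each eye rotates inward by atan(b / 2d). A minimal sketch, in which the baseline value and function names are illustrative assumptions rather than parameters from the Bunny Robot:

```python
import math

def vergence_angles(baseline_m: float, target_dist_m: float) -> tuple[float, float]:
    """Symmetric vergence for a target straight ahead.

    Each eye turns inward by atan(b / 2d), where b is the distance
    between the two cameras and d is the fixation distance.
    Returns (left, right) inward rotation in degrees.
    """
    half = math.atan2(baseline_m / 2.0, target_dist_m)
    inward_deg = math.degrees(half)
    return inward_deg, inward_deg

# Hypothetical 6 cm camera baseline, fixating a target 0.5 m ahead.
left, right = vergence_angles(0.06, 0.5)
print(f"{left:.2f}")  # each eye turns inward by about 3.43 degrees
```

In practice such angles would be converted to servo set-points (here, for the AX-12+ eye servos), and the fixation distance would come from stereo disparity or a SLAM map rather than being given directly.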