Abstract
We present an algorithm for autonomous network calibration of visual sensor networks, which are becoming increasingly pervasive in everyday environments. The proposed algorithm works in a fully decentralized way and minimizes the use of computationally expensive vision algorithms. To achieve network calibration, our approach relies on jointly detected objects and the geometric relations between camera nodes. Distances and angles are the only information that needs to be exchanged between nodes. The process proceeds iteratively until each camera has determined the relative positions and orientations of its neighbors. Preliminary results are demonstrated using our visual sensor network simulator.
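The abstract does not specify the pose-recovery step, but one plausible building block is sketched below: assuming 2D geometry and at least two jointly detected objects, each camera converts its exchanged (distance, angle) observations into local coordinates and solves a least-squares rigid alignment (Kabsch-style, via SVD) to recover a neighbor's relative position and orientation. All function names here are illustrative, not taken from the paper.

```python
import math
import numpy as np

def polar_to_xy(dist, ang):
    """Convert one (distance, angle) observation into local x/y coordinates."""
    return np.array([dist * math.cos(ang), dist * math.sin(ang)])

def relative_pose(obs_a, obs_b):
    """Estimate camera B's pose in camera A's frame from jointly detected
    objects, given as lists of (distance, angle) pairs measured in each
    camera's own local frame. Returns (translation, orientation angle).

    This is a generic 2D rigid-alignment sketch, not the paper's algorithm.
    """
    # Object positions in each camera's local frame.
    pa = np.array([polar_to_xy(d, a) for d, a in obs_a])
    pb = np.array([polar_to_xy(d, a) for d, a in obs_b])
    # Center both point sets, then solve for the rotation via SVD (Kabsch).
    ca, cb = pa.mean(axis=0), pb.mean(axis=0)
    H = (pb - cb).T @ (pa - ca)           # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    R = Vt.T @ U.T                        # rotation mapping B's frame to A's
    if np.linalg.det(R) < 0:              # guard against a reflection solution
        Vt[-1] *= -1
        R = Vt.T @ U.T
    t = ca - R @ cb                       # B's position in A's frame
    theta = math.atan2(R[1, 0], R[0, 0])  # B's orientation in A's frame
    return t, theta
```

In such a scheme, only the (distance, angle) pairs cross the network, matching the abstract's claim that no image data needs to be exchanged between nodes.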