Currently, I am trying to split tracking and rendering across two machines. Machine A captures video from a webcam plugged into it, and ARToolKit running on A computes the tracking data (i.e., the position and orientation of the camera relative to the marker) for every frame. The raw video, along with the derived tracking data, is then sent to machine B, which performs the rendering based on what it receives.
So my question is: how can I ensure that each video frame and its corresponding tracking data arrive synchronized, so that no offset appears between the marker and the virtual object?
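One approach I am considering is to avoid synchronizing two separate streams at all: tag each frame with an ID on machine A and send the frame together with its pose in a single message, so machine B can never pair them incorrectly. Below is a minimal sketch of that idea in Python; the packet layout and helper names are my own assumptions, not part of ARToolKit:

```python
import struct

# Hypothetical packet layout (an assumption, not an ARToolKit format):
# header = frame_id (uint32) + 12 floats for a 3x4 pose matrix,
# followed by the raw video frame bytes.
HEADER_FMT = "!I12f"  # network byte order
HEADER_SIZE = struct.calcsize(HEADER_FMT)

def pack_packet(frame_id, pose, frame_bytes):
    """On machine A: bundle one video frame with its tracking data."""
    return struct.pack(HEADER_FMT, frame_id, *pose) + frame_bytes

def unpack_packet(packet):
    """On machine B: recover frame_id, pose, and the raw frame together."""
    fields = struct.unpack(HEADER_FMT, packet[:HEADER_SIZE])
    frame_id, pose = fields[0], list(fields[1:])
    return frame_id, pose, packet[HEADER_SIZE:]

# Round-trip example with stand-in data
pose = [float(i) for i in range(12)]   # placeholder pose values
frame = b"\x00" * 64                    # stand-in for raw pixel data
pkt = pack_packet(7, pose, frame)
fid, p, f = unpack_packet(pkt)
```

Because the pose travels in the same packet as the pixels it was computed from, B can render each frame with exactly the tracking data that belongs to it, at the cost of sending both over one channel.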
Has anyone worked on this kind of problem?
Thank you; any help would be appreciated.
Looking forward to hearing from you.
The Key Laboratory of Virtual Reality Technology, Ministry of Education
Beijing University of Aeronautics and Astronautics, China