Vision Measurement Scheme Using Single Camera Rotation

2013, Vol. 2013, pp. 1–7
Author(s): Shidu Dong

We propose a vision measurement scheme for estimating the distance or size of an object in a static scene, which requires a single camera with a 3-axis accelerometer rotating around a fixed axis. First, we formulate the rotation matrix and translation vector from one camera coordinate system to another in terms of the rotation angle, which can be obtained from the readouts of the sensor. Second, with the camera calibration data and through coordinate system transformation, we propose a method for calculating the orientation and position of the rotation axis relative to the camera coordinate system. Finally, given the rotation angle and two images of the object in the static scene, one taken before and the other after the camera rotation, the 3D coordinates of points on the object can be determined. Experimental results show the validity of our method.
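The key quantity in the scheme is the rigid transform between the camera poses before and after rotation about a fixed axis. As a hedged sketch (NumPy; the axis direction `k` and a point `p` on the axis stand in for the quantities the paper recovers from calibration), the Rodrigues formula gives the rotation matrix, and the offset of the axis from the optical centre produces the translation:

```python
import numpy as np

def axis_angle_transform(k, p, theta):
    """Rigid transform for a rotation of `theta` about the axis through
    point `p` with unit direction `k`, via the Rodrigues formula."""
    k = np.asarray(k, float) / np.linalg.norm(k)
    K = np.array([[0.0, -k[2], k[1]],
                  [k[2], 0.0, -k[0]],
                  [-k[1], k[0], 0.0]])          # cross-product matrix of k
    R = np.eye(3) + np.sin(theta) * K + (1 - np.cos(theta)) * (K @ K)
    t = (np.eye(3) - R) @ np.asarray(p, float)  # axis offset yields the translation
    return R, t
```

A point X expressed in the first camera frame then maps to `R @ X + t`; points on the rotation axis are fixed, as expected.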

2011, Vol. 50-51, pp. 468–472
Author(s): Chun Feng Liu, Shan Shan Kong, Hai Ming Wu

Digital cameras have been widely used in road transportation, railway transportation, and security systems. To determine the position of a digital camera in these fields, this paper proposes a geometric calibration method based on feature-point extraction from an arbitrary target. The paper first defines four coordinate systems, among them the world coordinate system and the camera coordinate system, whose origin is the camera's optical centre. Using the coordinates of the same point expressed in the different systems, the coordinate transformation between the world and camera coordinate systems is determined, and from it the camera's internal and external parameters; the external parameters are represented by a transformation (rotation) matrix and a translation vector, and together with the internal parameters they establish a single-camera location model. According to this model, the camera's external parameters yield the image-plane coordinates of the centre of a circle on the target.
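The chain the abstract describes—a world point carried into the camera frame by the external parameters, then projected through the internal parameters—can be sketched with the standard pinhole model (the intrinsic values below are hypothetical, not from the paper):

```python
import numpy as np

def project(K, R, t, Xw):
    """Pinhole projection: world point -> camera frame -> pixel coordinates."""
    Xc = R @ np.asarray(Xw, float) + t   # extrinsics: world to camera frame
    uvw = K @ (Xc / Xc[2])               # perspective division, then intrinsics
    return uvw[:2]

# Hypothetical intrinsics: focal length 800 px, principal point (320, 240).
K = np.array([[800.0, 0.0, 320.0],
              [0.0, 800.0, 240.0],
              [0.0, 0.0, 1.0]])
```

Calibration estimates K (internal) and R, t (external) from known target points; the located circle centre is then the image of its 3D position under this mapping.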


2021, Vol. 13 (10), pp. 1982
Author(s): Binhu Chai, Zhenzhong Wei

The mobile vision measurement system (MVMS) is widely used for location and attitude measurement during aircraft takeoff and landing. Its on-site global calibration, which determines the transformation between the MVMS coordinate system and the local-tangent-plane coordinate system, is crucial to high-accuracy measurement. In this paper, several new ideas are proposed to realize the global calibration of the MVMS effectively. First, the MVMS is regarded as azimuth and pitch measurement equipment with a virtual single image plane at focal length 1. Second, a new virtual omnidirectional camera model constructed from three mutually orthogonal image planes is put forward, which effectively resolves the problem of global calibration error magnification when the angle between the virtual single image plane and the view axis of the system becomes small. Meanwhile, an expanded factorial linear method is proposed to solve the global calibration equations, which effectively restrains the influence of calibration data error. Experimental results with synthetic data verify the validity of the proposed method.
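Treating the system as azimuth/pitch equipment with a virtual image plane at focal length 1 amounts to intersecting each measured line of sight with a plane one unit in front of the instrument. A minimal sketch under an assumed convention (azimuth about the vertical axis, pitch as elevation, boresight along y):

```python
import numpy as np

def virtual_image_point(azimuth, pitch):
    """Line of sight from azimuth/pitch angles, intersected with a virtual
    image plane at focal length 1 (axis convention is an assumption here)."""
    d = np.array([np.cos(pitch) * np.sin(azimuth),   # lateral component
                  np.cos(pitch) * np.cos(azimuth),   # boresight component
                  np.sin(pitch)])                    # vertical component
    return d[[0, 2]] / d[1]                          # (x, y) on the plane at depth 1
```

As the abstract notes, this single-plane mapping degrades when the sight line nears the plane (the divisor `d[1]` approaches zero), which is what the three-plane virtual omnidirectional model avoids.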


An analysis of compound rotations, such as occur in Eulerian cradles, is presented in terms of a calculus of rotation axes, without reference to the associated coordinate transformations. The general case of three rotation shafts mounted on one another, with any relation between them at datum zero, is presented. The problem and its solution may be represented entirely in terms of a plane octagon in which four sides have directions that are instrumental constants and the other four sides have lengths that are instrumental constants. When the first four sides are given lengths that express both the rotation angle and the axial direction of the required rotation, then the remaining four sides have directions that directly express the rotations in the drive shafts that will generate the required rotation. Analytic expressions are given for the shaft setting angles in the general case. If the first and third axes are parallel and the intermediate one perpendicular to these at datum zero (as in the four-circle diffractometer), then these reduce to

θ₁ = arctan(μ, σ) + [arctan(λ, ν) − ψ − ½sπ],
θ₂ = 2s arcsin(λ² + ν²)^½,
θ₃ = arctan(μ, σ) − [arctan(λ, ν) − ψ − ½sπ],
s = ±1, 0 ≤ arcsin(λ² + ν²)^½ ≤ ½π,

in which λ, μ, ν and σ are the four components of a rotation vector constructed such that λ, μ and ν are the direction cosines of the rotation axis multiplied by sin ½θ for a rotation angle θ, and σ is cos ½θ. ψ is a constant determined by the choice of directions to which λ and ν are measured. The results for the general case are also expressed in terms of more conventional variables.
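Under one concrete choice of conventions (parallel first/third shafts along z, intermediate shaft along y, ψ = 0, s = +1 — assumptions made here for illustration, since the paper leaves ψ to the instrument), the closed-form shaft angles can be sketched directly from the four-component rotation vector (λ, μ, ν, σ):

```python
import numpy as np

def shaft_angles(axis, theta, s=1.0):
    """Shaft setting angles (theta1, theta2, theta3) realising a rotation of
    `theta` about unit `axis`.  With the conventions assumed here, lambda, mu,
    nu are the y, z, x axis components times sin(theta/2), sigma = cos(theta/2),
    and psi = 0."""
    k = np.asarray(axis, float) / np.linalg.norm(axis)
    lam = k[1] * np.sin(theta / 2)
    mu = k[2] * np.sin(theta / 2)
    nu = k[0] * np.sin(theta / 2)
    sigma = np.cos(theta / 2)
    theta2 = 2.0 * s * np.arcsin(np.hypot(lam, nu))
    a = np.arctan2(mu, sigma)
    b = np.arctan2(lam, nu) - 0.5 * s * np.pi
    return a + b, theta2, a - b
```

With these conventions the composite Rz(θ₁) · Ry(θ₂) · Rz(θ₃) reproduces the requested rotation; for example, a 90° rotation about x decomposes as (−π/2, π/2, π/2).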


Author(s): K. Al-Durgham, D. D. Lichti, I. Detchev, G. Kuntze, J. L. Ronsky

A fundamental task in photogrammetry is the temporal stability analysis of the calibration parameters of a camera or imaging system. This is essential to validate the repeatability of the parameter estimation, to detect any behavioural changes in the camera/imaging system, and to ensure precise photogrammetric products. Many stability analysis methods exist in the photogrammetric literature; each has different methodological bases, advantages, and disadvantages. This paper presents a simple and rigorous stability analysis method that can be straightforwardly implemented for a single camera or an imaging system with multiple cameras. The basic collinearity model is used to capture differences between two calibration datasets and to establish the stability analysis methodology. Geometric simulation is used as a tool to derive image and object space scenarios. Experiments were performed on real calibration datasets from a dual fluoroscopy (DF; X-ray-based) imaging system. The calibration data consisted of hundreds of images and thousands of image observations from six temporal points over a two-day period for a precise evaluation of the DF system stability. The stability of the DF system was found to be within a range of 0.01 to 0.66 mm in terms of 3D coordinate root-mean-square error (RMSE) for the single-camera analysis, and 0.07 to 0.19 mm for the dual-camera analysis. To the best of the authors' knowledge, this work is the first to address the topic of DF stability analysis.
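The comparison behind such an RMSE figure is simple to state: reconstruct the same 3D targets with the two temporally separated calibrations and measure the discrepancy. A minimal sketch (the coordinate arrays are hypothetical placeholders, not the paper's data):

```python
import numpy as np

def stability_rmse(xyz_a, xyz_b):
    """RMSE of 3D point-to-point distances between reconstructions obtained
    with two different calibration parameter sets."""
    d = np.asarray(xyz_a, float) - np.asarray(xyz_b, float)
    return float(np.sqrt(np.mean(np.sum(d * d, axis=1))))
```

A small RMSE over all targets indicates the calibration parameters were stable between the two epochs; a drift in one camera shows up as a larger single-camera value.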


2011, Vol. 2011, pp. 1–16
Author(s): Yuchuan Wei, Hengyong Yu, Ge Wang

This paper provides auxiliary results for our general scheme of computed tomography. In 3D parallel-beam geometry, we first demonstrate that the inverse Fourier transform in different coordinate systems leads to different reconstruction formulas and explain why the Radon formula cannot directly work with truncated projection data. Also, we introduce a gamma coordinate system, analyze its properties, compute the Jacobian of the coordinate transform, and define weight functions for the inverse Fourier transform assuming a simple scanning model. Then, we derive Orlov's theorem and a weighted Radon formula from the inverse Fourier transform in the new system. Furthermore, we present the motion equation of the frequency plane and the conditions for sharp points of the instantaneous rotation axis. Our analysis of the motion of the frequency plane is related to the Frenet–Serret formulas of differential geometry.


2016, Vol. 13 (10), pp. 1650116
Author(s): Derya Kahveci, Yusuf Yaylı, İsmail Gök

The aim of this paper is to give geometrical and algebraic interpretations of the Euler–Rodrigues formula in Minkowski 3-space. First, for a given non-lightlike axis of unit length in [Formula: see text] and a given angle, the spatial displacement is represented by a [Formula: see text] semi-orthogonal rotation matrix using orthogonal projection. Second, we obtain classifications of the Euler–Rodrigues formula in terms of a semi-skew-symmetric matrix corresponding to a spacelike, timelike or lightlike axis and a rotation angle, with the help of the exponential map. Finally, an alternative method is given to find the rotation axis, and the Euler–Rodrigues formula is expressed via split quaternions in Minkowski 3-space.
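In the familiar Euclidean case, the Euler–Rodrigues formula is the quaternion sandwich v′ = q v q*; the split-quaternion expression in Minkowski 3-space is the analogue the paper develops. A sketch of the Euclidean counterpart only (not the split-quaternion construction):

```python
import numpy as np

def quat_mul(p, q):
    """Hamilton product of quaternions given as (w, x, y, z)."""
    w1, x1, y1, z1 = p
    w2, x2, y2, z2 = q
    return np.array([w1*w2 - x1*x2 - y1*y2 - z1*z2,
                     w1*x2 + x1*w2 + y1*z2 - z1*y2,
                     w1*y2 - x1*z2 + y1*w2 + z1*x2,
                     w1*z2 + x1*y2 - y1*x2 + z1*w2])

def euler_rodrigues(v, axis, theta):
    """Rotate v by theta about a unit axis via q v q* (Euler-Rodrigues)."""
    k = np.asarray(axis, float) / np.linalg.norm(axis)
    q = np.concatenate([[np.cos(theta / 2)], np.sin(theta / 2) * k])
    q_conj = q * np.array([1.0, -1.0, -1.0, -1.0])
    v_quat = np.concatenate([[0.0], np.asarray(v, float)])
    return quat_mul(quat_mul(q, v_quat), q_conj)[1:]
```

In the Minkowski setting the quaternion algebra is replaced by split quaternions and the axis may be spacelike, timelike, or lightlike, which is exactly the classification the paper works out.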


1999, Vol. 46 (4), pp. 1055–1061
Author(s): T. Farncombe, A. Celler, D. Noll, J. Maeght, R. Harrop

1995, Vol. 13 (7), pp. 713–716
Author(s): M. A. Hapgood

Abstract. Raw data on spacecraft orbits and attitude are usually supplied in "inertial" coordinates. The normal geocentric inertial coordinate system changes slowly in time owing to the effects of astronomical precession and the nutation of the Earth's rotation axis. However, only precession produces a change that is significant compared with the errors in determining spacecraft position. We show that the transformations specified by Russell (1971) and Hapgood (1992) are strictly correct only if the epoch-of-date inertial system is used. We provide a simple formula for estimating the error in the calculated position if the inertial system for some other epoch is used. We also provide a formula for correcting inertial coordinates to the epoch-of-date from the standard fixed epoch of J2000.0.
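The size of the error from using the wrong inertial epoch can be bounded with a back-of-the-envelope estimate (not the paper's exact formula): general precession is roughly 50.3 arcseconds per year, and a rotation by a small angle ε displaces a point at geocentric distance r by at most rε.

```python
import numpy as np

ARCSEC = np.pi / (180.0 * 3600.0)        # one arcsecond in radians
PRECESSION_RATE = 50.3 * ARCSEC          # general precession, rad/year (approximate)

def precession_error_bound(r_km, years_from_epoch):
    """Upper bound (km) on spacecraft position error when inertial
    coordinates referred to the wrong epoch are used uncorrected."""
    return r_km * PRECESSION_RATE * abs(years_from_epoch)
```

At geosynchronous distance (about 42 000 km), a 10-year epoch mismatch already corresponds to roughly 100 km, which is why precession, unlike nutation, is significant compared with orbit determination errors.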

