
Quantitative planar region detection

Published in: International Journal of Computer Vision

Abstract

This paper presents a means of segmenting planar regions from two views of a scene using point correspondences. The initial selection of groups of coplanar points is performed on the basis of the conservation of two five-point projective invariants (groups for which these invariants are conserved are assumed to be coplanar). The five point correspondences are used to estimate a projectivity, which is then used to predict the change in position of other points under the assumption that they lie on the same plane as the original four. The variance in each point's new position is used to define a distance threshold between actual and predicted positions, which serves as a coplanarity test for finding extended planar regions. If two distinct planar regions can be found, a novel motion-direction estimator suggests itself. The projection of the line of intersection of two planes in an image may also be recovered. An analytical error model is derived which relates image uncertainty in a corner's position to the genuine perpendicular height of a point above a given plane in the world. The model may be used, for example, to predict the performance of a given stereo ground-plane prediction system or a monocular drivable-region detection system on an AGV. The model may also be used in reverse to determine the camera resolution required if a vehicle in motion is to resolve obstacles of a given height at a given distance.
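The pipeline the abstract describes — grouping points by conservation of five-point projective invariants, fitting a plane projectivity (homography) to the correspondences, and accepting further points whose transfer error falls below a distance threshold — can be sketched as follows. This is a minimal NumPy sketch under assumed conventions (points as (x, y) image coordinates); the function names and the simplified pinhole formula at the end are illustrative assumptions, not the paper's exact formulation or its full error model.

```python
import numpy as np

def five_point_invariant(pts):
    """One of the two projective invariants of five coplanar points:
    a ratio of products of determinants of homogeneous point triples,
    balanced so each point appears equally often in numerator and
    denominator (the per-point scale factors and det(H) then cancel)."""
    p = np.hstack([np.asarray(pts, dtype=float), np.ones((5, 1))])
    def d(i, j, k):
        return np.linalg.det(np.stack([p[i], p[j], p[k]]))
    return (d(3, 1, 0) * d(4, 2, 0)) / (d(3, 2, 0) * d(4, 1, 0))

def homography_from_points(src, dst):
    """Estimate the 3x3 plane projectivity mapping src -> dst from
    four (or more) correspondences via the direct linear transform."""
    A = []
    for (x, y), (u, v) in zip(src, dst):
        A.append([-x, -y, -1, 0, 0, 0, u * x, u * y, u])
        A.append([0, 0, 0, -x, -y, -1, v * x, v * y, v])
    # The homography vector is the right singular vector of A
    # associated with the smallest singular value.
    _, _, Vt = np.linalg.svd(np.asarray(A, dtype=float))
    return Vt[-1].reshape(3, 3)

def transfer(H, pts):
    """Map points through the projectivity and dehomogenise."""
    ph = np.hstack([np.asarray(pts, dtype=float), np.ones((len(pts), 1))])
    q = ph @ H.T
    return q[:, :2] / q[:, 2:3]

def coplanar_mask(H, pts1, pts2, threshold):
    """Coplanarity test: accept a correspondence when the transfer
    error |H(p1) - p2| is below the distance threshold."""
    return np.linalg.norm(transfer(H, pts1) - pts2, axis=1) < threshold

def min_resolvable_height(pixel_noise, distance, focal_px):
    """Toy monocular analogue of the 'reverse' use of the error model:
    for a pinhole camera of focal length focal_px (pixels), a point
    raised h above the ground plane at range Z shifts roughly
    focal_px * h / Z pixels from its ground-plane prediction, so the
    smallest resolvable height is pixel_noise * Z / focal_px."""
    return pixel_noise * distance / focal_px
```

In use, one would compute the invariant over candidate five-point groups in both views, fit a homography to groups whose invariants agree, and then grow the planar region with `coplanar_mask`, with the threshold set from the position variance of each point as the paper describes.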




Cite this article

Sinclair, D., Blake, A. Quantitative planar region detection. Int J Comput Vision 18, 77–91 (1996). https://doi.org/10.1007/BF00126141
