Hey all,
I am working on a program for a webcam-based touchscreen: http://rp181.wordpress.com/2010/01/0...h-touchscreen/

Before explaining, here is a diagram of the setup:

Right now, I simply take two calibration points (the top-left and top-right corners of the monitor) for the x axis (I'll work on the y axis after). I calculate the distance between those points in the picture, in pixels. Let's say it ended up being 200 pixels.

The program then takes the screen resolution as input. Mine happens to be 1920 by 1080.

I did 1920/200 = 9.6

Next, it finds the current location of the IR beacon. Let's pretend it is 100 pixels from the left-most calibration point.

9.6 * 100 = 960
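The mapping described above can be sketched like this (a minimal sketch in Python, using the example numbers from this post; the function name and parameters are just for illustration, not from my actual program):

```python
# Minimal sketch of the linear x-axis mapping described above.

def map_x(beacon_x, cal_left_x, cal_right_x, screen_width):
    """Linearly map a camera-image x coordinate to a screen x coordinate."""
    cam_span = cal_right_x - cal_left_x  # distance between calibration points, in camera pixels
    scale = screen_width / cam_span      # e.g. 1920 / 200 = 9.6 screen px per camera px
    return (beacon_x - cal_left_x) * scale

# Example: calibration points 200 camera pixels apart, beacon 100 px from the left point
print(map_x(100, 0, 200, 1920))  # → 960.0
```

This assumes the camera-to-screen relationship is the same everywhere along the axis, which is exactly what breaks down under perspective.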

However, perspective distortion is messing up the final coordinates. With my current setup, it comes out to about 475 pixels instead.

My question:

How would I go about factoring in the distortion? How do I know how much smaller it is with distance? Will I be able to use basic trig from there?

Any help would be appreciated.