This should be basic, baby stuff.
Here's a little snippet of my code:
public Vector2 SubjectScreenOffset = new Vector2(-0.125f, 0.0f); // offset from the centre of the screen; a value of +/-0.5 reaches the edge of the corresponding axis

void CalculateCameraOffset()
{
    Vector2 screenSize = new Vector2(cam.pixelWidth, cam.pixelHeight);
    print("SubjectScreenOffset = " + SubjectScreenOffset);
    print("ScreenSize = " + screenSize);
    print("Testvalue: " + (SubjectScreenOffset.x * screenSize.x));
    Vector2 screenPixelPosition = new Vector2(
        (SubjectScreenOffset.x * screenSize.x) + (screenSize.x * 0.5f),
        (SubjectScreenOffset.y * screenSize.y) + (screenSize.y * 0.5f));
    print("ScreenPixelPosition = " + screenPixelPosition);
    //Vector3 desiredPlayerPos = cam.ScreenToWorldPoint();
}
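For reference, here's what I expect that maths to produce (assuming a 1920x1080 game view, just as an example resolution):

ScreenPixelPosition.x = (-0.125 * 1920) + (1920 * 0.5) = -240 + 960 = 720
ScreenPixelPosition.y = ( 0.0   * 1080) + (1080 * 0.5) =    0 + 540 = 540

i.e. the subject should end up 240 pixels left of centre.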
The first print outputs: "SubjectScreenOffset = (0.1, 0.0)"
Clearly that is not the initial value I declared. Two decimal places have been chopped off, and the sign is lost. What's going on?
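To rule out plain display rounding (I think Vector2.ToString() rounds to one decimal place by default, though I haven't confirmed which Unity versions do that), I could log it with an explicit format string:

print("SubjectScreenOffset = " + SubjectScreenOffset.ToString("F4")); // show four decimal places instead of the default rounding

But even rounding wouldn't explain the missing minus sign.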