Can someone explain the results of CGRectGetMaxX and CGRectIntersection?

Scenario

Suppose that there are two disjoint rectangles, both 10 points wide and 5 points high. The x/y values in the following diagram indicate the coordinates of the corner points of the two rectangles.

x/y = 0/0 +------------------+ x/y = 9/0
          |                  |
          |                  |
          |                  |
          |                  |
x/y = 0/4 +------------------+ x/y = 9/4
x/y = 0/5 +------------------+ x/y = 9/5
          |                  |
          |                  |
          |                  |
          |                  |
x/y = 0/9 +------------------+ x/y = 9/9

Here is the code for setting these rectangles:

CGRect rect1 = CGRectMake(0.0f, 0.0f, 10.0f, 5.0f);
CGRect rect2 = CGRectMake(0.0f, 5.0f, 10.0f, 5.0f);

CGRectGetMaxX

The return value of the CGRectGetMaxX Core Graphics function is documented as follows:

The largest x-coordinate value for the rectangle.

Here is a piece of code that uses this function:

CGFloat maxX = CGRectGetMaxX(rect1);

Based on the documentation, I expect maxX to be 9. However, maxX is equal to 10. Why?
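
For reference, here is a minimal, self-contained reproduction of this observation. Assuming a plain C command-line program on macOS built against the CoreGraphics framework (the printf output is my own addition; the rectangle is the same as above):

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

// Build with: cc maxx.c -framework CoreGraphics -o maxx
int main(void) {
    // Same rectangle as above: origin (0, 0), size 10 x 5.
    CGRect rect1 = CGRectMake(0.0f, 0.0f, 10.0f, 5.0f);

    // Prints "maxX = 10.0", not 9.0.
    CGFloat maxX = CGRectGetMaxX(rect1);
    printf("maxX = %.1f\n", (double)maxX);

    return 0;
}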

CGRectIntersection

The return value of the CGRectIntersection Core Graphics function is documented as follows:

A rectangle that represents the intersection of the two specified rectangles. If the two rectangles do not intersect, returns the null rectangle. To check for this condition, use CGRectIsNull.

Here is a piece of code that uses this function:

CGRect intersectionRect = CGRectIntersection(rect1, rect2);
bool intersectionRectIsNull = CGRectIsNull(intersectionRect);

I expect intersectionRectIsNull to be true, because rect1 and rect2 do not intersect. However, intersectionRectIsNull is false. Why?

Inspecting intersectionRect in the debugger shows the following:

(CGRect) $0 = origin=(x=0, y=5) size=(width=10, height=0)

Since the two rectangles do not intersect, shouldn't CGRectIsNull return true instead of false, and shouldn't intersectionRect look like this instead?

(CGRect) $0 = origin=(x=+Inf, y=+Inf) size=(width=0, height=0)
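
For comparison, the null rectangle shown above can be produced directly via the CGRectNull constant, and CGRectIsNull does return true for it. A minimal check, again assuming a macOS command-line build against the CoreGraphics framework:

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

// Build with: cc nullrect.c -framework CoreGraphics -o nullrect
int main(void) {
    // CGRectNull is the constant null rectangle:
    // origin=(+Inf, +Inf), size=(0, 0).
    printf("isNull  = %d\n", CGRectIsNull(CGRectNull));    // prints 1

    // A zero-height rectangle, by contrast, is empty but not null.
    CGRect zeroHeight = CGRectMake(0.0f, 5.0f, 10.0f, 0.0f);
    printf("isNull  = %d\n", CGRectIsNull(zeroHeight));    // prints 0
    printf("isEmpty = %d\n", CGRectIsEmpty(zeroHeight));   // prints 1

    return 0;
}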

Am I doing something wrong here, or am I misunderstanding the documentation? Either way, I would appreciate an explanation.

Answer

The problem is your reading of CGRectMake.

A rectangle that is 10 points wide and starts at x = 0 extends along the x-axis to 10, not to 9.

That is why you get 10 rather than 9, and 5 rather than 4.

Take another look at CGRectMake: it takes an origin and a size, not the coordinates of two corners. The corner coordinates in your diagram are therefore off by one.

The maximum x-coordinate is 0 + 10 = 10, not 9, and the maximum y-coordinate of rect1 is 0 + 5 = 5, not 4. And because rect1's maximum y-coordinate (5) is exactly rect2's minimum y-coordinate, the two rectangles are not disjoint at all: they share an edge, so their intersection is the degenerate zero-height rectangle you saw, which is empty but not null.
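
Putting both halves together, here is a sketch that verifies the arithmetic and the intersection at once (a plain C program; the compile command in the comment is an assumption about a typical macOS setup):

#include <CoreGraphics/CoreGraphics.h>
#include <stdio.h>

// Build with: cc verify.c -framework CoreGraphics -o verify
int main(void) {
    CGRect rect1 = CGRectMake(0.0f, 0.0f, 10.0f, 5.0f);
    CGRect rect2 = CGRectMake(0.0f, 5.0f, 10.0f, 5.0f);

    // rect1 spans x in [0, 10] and y in [0, 5];
    // rect2 spans x in [0, 10] and y in [5, 10].
    printf("rect1 maxX = %.1f\n", (double)CGRectGetMaxX(rect1)); // 10.0
    printf("rect1 maxY = %.1f\n", (double)CGRectGetMaxY(rect1)); // 5.0
    printf("rect2 minY = %.1f\n", (double)CGRectGetMinY(rect2)); // 5.0

    // The shared edge at y = 5 makes the intersection a zero-height
    // rectangle: empty, but not the null rectangle.
    CGRect inter = CGRectIntersection(rect1, rect2);
    printf("inter = (%.1f, %.1f, %.1f, %.1f)\n",
           (double)inter.origin.x, (double)inter.origin.y,
           (double)inter.size.width, (double)inter.size.height);
    printf("isNull = %d, isEmpty = %d\n",
           CGRectIsNull(inter), CGRectIsEmpty(inter));           // 0, 1

    return 0;
}

This also illustrates the distinction the documentation draws: CGRectIsNull is true only for the special null rectangle returned when two rectangles have no points in common at all, while CGRectIsEmpty is true for any rectangle with zero width or height, including this shared-edge case.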
