NSUInteger in iOS7


I'm having a really weird problem with NSUInteger in iOS 7.

Everything worked fine before iOS 7, so I guess it's related to the 64-bit support introduced in iOS 7.

My code is like this, very simple:

if (blah blah blah) {
    NSUInteger firstRow = 0;
    firstRow = ([self.types containsObject:self.selectedMajorType] ?
        [self.types indexOfObject:self.selectedMajorType] + 1 : 0);
    ...
}

According to my console:

[self.types containsObject:self.selectedMajorType] is true,

[self.types indexOfObject:self.selectedMajorType] + 1 is 1,

no doubt about it, and indexOfObject: also returns an NSUInteger (according to Apple's documentation).

Here's the screenshot: [screenshot]

But firstRow is always 0.
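A simple sanity check (not something from my original code, just an illustration) would be to log the value with a 64-bit-safe format specifier instead of relying on the debugger:

    // Cast to unsigned long so the format specifier is correct on both 32-bit and 64-bit
    NSLog(@"firstRow = %lu", (unsigned long)firstRow);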

This is so strange; I don't know what's going on with NSUInteger.

Can someone help me? Thanks a lot!

____new finding____ [screenshot]

I guess this is the problem? It's weird..


BEST ANSWER

I tried to recreate this scenario, but I always got the expected result of 1.

Here is the screenshot: [screenshot]

Here is the project; try running it and see if you still face the problem.
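For reference, here is a minimal sketch of the kind of reproduction I tried (the array contents are made up, since your actual data isn't shown):

    NSArray *types = @[@"A", @"B", @"C"];
    NSString *selectedMajorType = @"A";

    NSUInteger firstRow = 0;
    firstRow = ([types containsObject:selectedMajorType] ?
                [types indexOfObject:selectedMajorType] + 1 : 0);

    // Logs 1, as expected, on both the 32-bit and 64-bit simulators
    NSLog(@"firstRow = %lu", (unsigned long)firstRow);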

P.S. I was using Xcode 5.1 and the 64-bit iPhone Simulator.

=============UPDATE================

Here are some explanations of the lldb commands you used.

po: prints the Objective-C description of an object.

print / p: evaluates a generalized expression in the current frame. You may need to specify the return type of a method with a cast when the debugger can't infer it (for example, when the method isn't otherwise used in your program).
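For example, at a breakpoint inside that if block you could try something along these lines (the explicit cast tells lldb the return type of the message send):

    (lldb) po self.types
    (lldb) p (NSUInteger)[self.types indexOfObject:self.selectedMajorType] + 1
    (lldb) p firstRow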

I hope this screenshot helps you understand it better.

[screenshot]