I would think that to target an SDK, the compile SDK would have to be at or higher than the target SDK. I recently changed the target SDK to be higher than the compile SDK and the app still worked. I don't understand how this is possible - can someone please help me understand what I am missing?
Here's a long post from one of the Android devs explaining the whole thing, what you should and shouldn't change, and why!
There's this little formula about the relationship between the different numbers:
`minSdkVersion <= targetSdkVersion <= compileSdkVersion`

But I think the point there is that this is how it *should* be. `minSdk` acts as a minimum API level, and `compileSdk` is the version of the API the app is actually compiled against, making it an effective maximum "included" API level (you can't add features that aren't in it yet!). `targetSdk` is meant to act as a control for features - you can (and should) compile against the most recent API for bug fixes etc., but actual behaviour changes can be restricted by limiting `targetSdk` to an earlier API level.

So really it works like "don't enable any behaviour changes from APIs higher than this". In which case, it doesn't really make sense to set it higher than `compileSdk` - there aren't any extra features available beyond the API level you're compiling against. Plus you shouldn't be raising `targetSdk` without testing against that API level - which you obviously haven't, if you're not compiling against that API!

So I guess there's nothing stopping you from setting `targetSdk` at whatever level you want; it's really `compileSdk` setting that upper limit on what actually gets into the app. There just isn't any good reason to set `targetSdk` higher than that.
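For reference, here's roughly where those three values live in a module-level `build.gradle.kts` - a minimal sketch using the modern Android Gradle Plugin property names, with example version numbers (not a recommendation):

```kotlin
// Module-level build.gradle.kts (sketch - version numbers are illustrative)
android {
    compileSdk = 34       // API the app is compiled against - the effective maximum

    defaultConfig {
        minSdk = 24       // oldest API level the app will install on
        targetSdk = 34    // newest API level whose behaviour changes you opt into
    }
}
```

Keeping `compileSdk` and `targetSdk` equal (and as recent as you've tested against) is the usual arrangement that satisfies `minSdkVersion <= targetSdkVersion <= compileSdkVersion`.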
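To make the "control for features" idea concrete, here's a hypothetical sketch (plain Kotlin, not real framework code - the function name and parameters are made up for illustration) of the kind of decision the platform effectively makes: a new behaviour only kicks in when both the device is new enough *and* the app has opted in via its `targetSdk`:

```kotlin
// Conceptual sketch of targetSdk-based behaviour gating.
// deviceApi:  API level of the device the app is running on
// targetSdk:  the app's declared targetSdkVersion
// featureApi: API level where the new behaviour was introduced
fun useNewBehavior(deviceApi: Int, targetSdk: Int, featureApi: Int): Boolean =
    deviceApi >= featureApi && targetSdk >= featureApi

fun main() {
    // New device + app targeting the new API: behaviour change applies.
    println(useNewBehavior(deviceApi = 34, targetSdk = 34, featureApi = 31))
    // New device, but the app targets an older API: old behaviour is kept.
    println(useNewBehavior(deviceApi = 34, targetSdk = 29, featureApi = 31))
    // Old device can never run the new behaviour, whatever the app targets.
    println(useNewBehavior(deviceApi = 29, targetSdk = 34, featureApi = 31))
}
```

This is why raising `targetSdk` is the step that needs testing: it flips on every behaviour change up to that level.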