I have an iOS app that I created with Xcode 4.0’s “Window-based Application” template. It worked fine back then, built against the iOS 4.3 SDK. This is an app that simply puts buttons, labels, etc. directly onto a window. No view controllers, nothing.
But now that I’ve upgraded to Xcode 4.2 (and its iOS 5.0 SDK), this message gets logged to the console whenever the app launches in the simulator:
“Applications are expected to have a root view controller at the end of application launch”
To be sure, the app continues to work, but this rather bothersome message gets logged on every launch.
Why is this happening? Why does iOS 5.0 prefer/request view controllers?

I don’t have anything against them, and I will use them if iOS wants me to. I was just curious about the above behavior.
I don't know specifically why the message is logged, but integration between UIWindow and UIViewController has been increasing over the last several iterations of iOS. iOS 4 added a rootViewController property to UIWindow, and the two classes work together to manage view rotation. Given the new capabilities that iOS 5 introduced to UIViewController (specifically, the ability to create your own container view controllers), it's clear that the relationship between the two classes will continue to evolve. As you've said, your app continues to function in iOS 5, so having a root view controller isn't a hard-and-fast requirement yet. Perhaps there are features planned for future iOS versions that will depend on having a view controller available.
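To make the container point concrete, here's a minimal sketch of the parent–child API that iOS 5 added to UIViewController. The helper name embedChild: is hypothetical, but addChildViewController: and didMoveToParentViewController: are the actual iOS 5 calls:

    // Inside a custom container view controller (iOS 5+).
    // embedChild: is a hypothetical helper name, not a UIKit method.
    - (void)embedChild:(UIViewController *)child
    {
        [self addChildViewController:child];        // establish the parent-child link
        child.view.frame = self.view.bounds;
        [self.view addSubview:child.view];          // install the child's view
        [child didMoveToParentViewController:self]; // finish the containment handoff
    }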
I'd interpret the logged message as a gentle but persistent nudge from Apple toward providing a root view controller. Most apps already use view controllers anyway, so this isn't a big change, but there are probably a number of apps out there that don't properly set the window's rootViewController property to their top-level view controller.
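As a concrete illustration, here is one way to set that property in a window-based app and silence the message. This is a sketch, not your actual code: the class name MyAppDelegate is a stand-in for the template's app delegate, and it assumes you move the buttons, labels, etc. onto the controller's view instead of the window:

    // A minimal sketch (ARC, iOS 5 era). MyAppDelegate is a hypothetical
    // stand-in for the window-based template's app delegate class.
    #import <UIKit/UIKit.h>

    @interface MyAppDelegate : UIResponder <UIApplicationDelegate>
    @property (strong, nonatomic) UIWindow *window;
    @end

    @implementation MyAppDelegate

    - (BOOL)application:(UIApplication *)application
        didFinishLaunchingWithOptions:(NSDictionary *)launchOptions
    {
        self.window = [[UIWindow alloc] initWithFrame:[[UIScreen mainScreen] bounds]];

        // A plain UIViewController whose view hosts the controls that used
        // to sit directly on the window; assigning it as rootViewController
        // silences the "expected to have a root view controller" log.
        UIViewController *rootViewController = [[UIViewController alloc] init];
        // ...add the buttons, labels, etc. to rootViewController.view here...
        self.window.rootViewController = rootViewController;

        [self.window makeKeyAndVisible];
        return YES;
    }

    @end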