Normally, when running an iOS App, the main UIWindow is black and takes up the entire device's screen. It is possible to have more than one UIWindow, but the usual case for this is when content is being displayed on a secondary display, e.g. via AirPlay. On macOS it is common to have many windows (NSWindow being the equivalent class there) displayed on the Mac's desktop. It's not usual to think of an iOS device as having a desktop, since when using an App (or multiple Apps, if multi-tasking) the desktop/background is completely obscured.
Nonetheless, UIWindow is a subclass of UIView. This inherently means it has a frame, i.e. a position in its parent's coordinate space plus a width & height. I wondered if these could be altered - they can!
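Since a UIWindow is just a UIView, its frame can be set like any other view's. As a minimal sketch (assuming this runs inside a view controller whose view.window is the App's main window; the specific geometry is just for illustration):

```swift
// Shrink the App's window to half size and nudge it away from the
// screen origin - the rest of the screen simply shows through as black.
if let window = view.window {
    window.frame = CGRect(x: 40, y: 80,
                          width: window.frame.width / 2,
                          height: window.frame.height / 2)
}
```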
I wrote a very simple single-page App containing a UIView that by default has the screen dimensions and is coloured blue; on top of this, roughly in the centre, is a red rectangle (a child UIView).
There is a set of 3 gesture recognisers (pinch, rotate & pan) that can be applied to either the Red Rect, the View or the underlying Window. Which one they're applied to is set using the Picker control hosted by the UIView.
The next picture shows all 3 components resized, rotated and moved around. Note how the Status Bar has almost disappeared: with no UIWindow behind it, its black text is now sitting on a black background (though the green battery indicator is still present).
By dragging the Window back over the Status Bar area, the Status Bar is revealed again.
Next, the Red Rectangle has been completely detached from its parent UIView. At this point touch events no longer seem to be delivered to it.
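This appears to be standard UIKit hit-testing behaviour: hit-testing descends the view hierarchy from the Window, and a view lying outside its parent's bounds fails the parent's point(inside:) check, so it is never offered the touch. Purely for illustration (the demo App does not do this), a hypothetical container view could override hitTest to keep out-of-bounds children interactive:

```swift
import UIKit

// A sketch of why the detached Red Rect goes dead: the default hitTest
// never reaches it. This container offers touches to subviews even when
// they lie outside its own bounds.
class PermissiveView: UIView {
    override func hitTest(_ point: CGPoint, with event: UIEvent?) -> UIView? {
        // Try the normal path first (covers the in-bounds case).
        if let hit = super.hitTest(point, with: event) { return hit }
        // Otherwise offer the touch to each subview, topmost first,
        // converting the point into that subview's coordinate system.
        for subview in subviews.reversed() {
            let converted = subview.convert(point, from: self)
            if let hit = subview.hitTest(converted, with: event) { return hit }
        }
        return nil
    }
}
```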
Taking this further, by moving (via the pan gesture) the View off the Window, it too stops receiving input. At this point nothing more can be done with the App.
That's not completely true. The App can still be rotated. This has some bizarre results, probably not helped by there being no constraints.
The code for the App is available on GitHub, though the heart of the ViewController is shown below.
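Rather than reproduce the full listing, here is a condensed sketch of the ViewController; names such as redRect, picker and attach(to:) are illustrative rather than necessarily the repository's exact ones:

```swift
import UIKit

class ViewController: UIViewController, UIPickerViewDataSource, UIPickerViewDelegate {
    @IBOutlet weak var redRect: UIView!       // the Red Rect
    @IBOutlet weak var picker: UIPickerView!  // selects the gesture target

    private lazy var pinch = UIPinchGestureRecognizer(target: self, action: #selector(pinched(_:)))
    private lazy var rotate = UIRotationGestureRecognizer(target: self, action: #selector(rotated(_:)))
    private lazy var pan = UIPanGestureRecognizer(target: self, action: #selector(panned(_:)))

    // The view the gestures currently act upon: Red Rect, View or Window.
    private var target: UIView!

    override func viewDidLoad() {
        super.viewDidLoad()
        picker.dataSource = self
        picker.delegate = self
        attach(to: redRect)
    }

    // Re-assign all three recognisers to the newly selected view.
    private func attach(to newTarget: UIView) {
        for recognizer in [pinch, rotate, pan] as [UIGestureRecognizer] {
            target?.removeGestureRecognizer(recognizer)
            newTarget.addGestureRecognizer(recognizer)
        }
        target = newTarget
    }

    // Each handler applies the gesture's delta, then resets it so the
    // next invocation reports a change relative to the current state.
    @objc private func pinched(_ gesture: UIPinchGestureRecognizer) {
        target.transform = target.transform.scaledBy(x: gesture.scale, y: gesture.scale)
        gesture.scale = 1
    }

    @objc private func rotated(_ gesture: UIRotationGestureRecognizer) {
        target.transform = target.transform.rotated(by: gesture.rotation)
        gesture.rotation = 0
    }

    @objc private func panned(_ gesture: UIPanGestureRecognizer) {
        let translation = gesture.translation(in: target.superview)
        target.center = CGPoint(x: target.center.x + translation.x,
                                y: target.center.y + translation.y)
        gesture.setTranslation(.zero, in: target.superview)
    }

    // MARK: Picker plumbing
    func numberOfComponents(in pickerView: UIPickerView) -> Int {
        return 1
    }
    func pickerView(_ pickerView: UIPickerView, numberOfRowsInComponent component: Int) -> Int {
        return 3
    }
    func pickerView(_ pickerView: UIPickerView, titleForRow row: Int, forComponent component: Int) -> String? {
        return ["Red Rect", "View", "Window"][row]
    }
    func pickerView(_ pickerView: UIPickerView, didSelectRow row: Int, inComponent component: Int) {
        attach(to: [redRect, view, view.window][row]!)
    }
}
```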
The only slightly strange things happening here are:
- The re-assignment of the Gesture Recognizers to the selected UIView
- How each of the methods invoked by the Gesture Recognizers 'resets' the main parameter modified by the gesture after each invocation, so that each callback receives a delta relative to the current state rather than a cumulative value.
If you run the project, note that because there are multiple Gesture Recognizers but simultaneous gesture recognition hasn't been enabled, pinch & rotate in particular must be performed as discrete gestures. Also, rotation can be fiddly to get working, especially if the UI element has been pinched down to a small size.
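Should simultaneous gestures be wanted, UIKit does support opting in via UIGestureRecognizerDelegate; a sketch of how it could be enabled (the demo project does not do this):

```swift
extension ViewController: UIGestureRecognizerDelegate {
    // Allow e.g. pinch & rotate to be recognised at the same time.
    func gestureRecognizer(_ gestureRecognizer: UIGestureRecognizer,
                           shouldRecognizeSimultaneouslyWith otherGestureRecognizer: UIGestureRecognizer) -> Bool {
        return true
    }
}
// Each recogniser's delegate would also need setting, e.g. in viewDidLoad:
//     pinch.delegate = self; rotate.delegate = self
```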
I'm not really aware of any practical application of this. It might be interesting to try to have 2 UIWindows displayed at the same time, though I'm not sure this is possible, and the same effect could easily be achieved with 2 UIViews. Perhaps one could have 2 sub-apps, each in its own UIWindow, which could be swapped between.
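For what it's worth, creating a second UIWindow directly does appear to work, at least in a single-scene App; a minimal sketch with illustrative values:

```swift
// A second window floating above the main one. It must be retained
// somewhere (e.g. stored in a property) or it will disappear immediately.
let secondWindow = UIWindow(frame: CGRect(x: 40, y: 80, width: 200, height: 200))
secondWindow.rootViewController = UIViewController()
secondWindow.rootViewController?.view.backgroundColor = .green
secondWindow.windowLevel = .alert  // sit above the main window
secondWindow.isHidden = false      // show it without making it key
```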