To monitor what goes on inside an iOS app, we need to collect all user interactions. This data gives us insight into how users actually use the app.
We need to collect data about the different types of user interactions in the app environment. Based on Apple's documentation, there are three types of user events that can be triggered:
- Touch events, delivered through the touchscreen; these include the various gestures
- Motion events, driven by the device's accelerometer and gyroscope, such as shaking or rotating the device
- Remote-control events, sent from an external accessory or headset, such as play/pause commands
The classification above is based on the input mode of the events. Based on the activity performed, however, events are grouped into gestures. Each gesture is a simple or composite action made by the user that triggers an expected response from the app.
There are many types of gestures, and in iOS they are handled by gesture recognizers. A gesture recognizer captures the events that match its predetermined gesture and processes them. So if you want to handle a tap, you initialise a UITapGestureRecognizer with a target and an action selector, and UIKit calls that selector whenever the tap is recognised.
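As a rough sketch of this first-party flow (the handleTap: name is just illustrative), a view controller might wire up a tap recogniser like this:

// In a view controller, e.g. in viewDidLoad:
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(handleTap:)];
[self.view addGestureRecognizer:tap];

// Called by UIKit whenever the tap is recognised.
- (void)handleTap:(UITapGestureRecognizer *)recognizer {
    NSLog(@"View tapped at %@",
          NSStringFromCGPoint([recognizer locationInView:self.view]));
}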
That is the straightforward implementation. As a third party, however, we have to capture the gestures at a central point where we can collect data about the gesture events without interfering with the regular flow of the app.
Approach 1: Attach a subclass of UIApplication in the main.m file
One way to do this, according to this Stack Overflow post, is to supply a custom subclass of UIApplication by passing its name to UIApplicationMain in the main.m file.
The code in the main.m file would be:
return UIApplicationMain(argc, argv, @"MyApplication", @"MyApplicationDelegate");
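In context, a minimal main.m would look something like this sketch:

#import <UIKit/UIKit.h>

int main(int argc, char *argv[]) {
    @autoreleasepool {
        // Naming our subclass as the principal class makes UIKit
        // instantiate MyApplication instead of plain UIApplication.
        return UIApplicationMain(argc, argv, @"MyApplication", @"MyApplicationDelegate");
    }
}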
where MyApplication is the name of the custom class that subclasses UIApplication. In MyApplication we override sendEvent:, log the event, and then forward it on by calling the superclass implementation:
- (void)sendEvent:(UIEvent *)event {
    // Log the event here (hand it off to a utility class, or whatever
    // your tracking code does), then forward it to the superclass.
    [super sendEvent:event];
}
This works for all events. However, I found that some apps ship their own subclass of UIApplication, so our injected code cannot stay the same everywhere. Also, since we receive raw events rather than gestures, we get a lot of noise.
For example, we can get up to two events for a single tap gesture: a touch-down and a touch-up, but we want a single gesture out of them. We may have to pick the appropriate event by identifying the touch-up alone, since the tap can be considered complete when the finger lifts off the screen.
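A minimal sketch of that filtering inside our sendEvent: override, assuming we only care about completed taps, could look like this:

- (void)sendEvent:(UIEvent *)event {
    if (event.type == UIEventTypeTouches) {
        for (UITouch *touch in event.allTouches) {
            // Only the ended phase marks a completed tap.
            if (touch.phase == UITouchPhaseEnded) {
                NSLog(@"Touch ended at %@",
                      NSStringFromCGPoint([touch locationInView:nil]));
            }
        }
    }
    [super sendEvent:event];
}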
Consider a more complex gesture such as a swipe or a pan: each movement of the finger is delivered as a new event, so a large number of events is generated. Parsing through all of them is impractical and expensive in both time and CPU. Note that we have to finish our work before the event is allowed to perform its intended action in the app's interface.
We have to be almost instantaneous, or we have to split our processing onto a different thread asynchronously, which may not be straightforward inside the UIApplication subclass.
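For what it's worth, here is a minimal sketch of such offloading, assuming we only need simple metadata from the event; UIEvent objects are reused by UIKit, so we copy what we need before dispatching:

- (void)sendEvent:(UIEvent *)event {
    // Copy the bits we need before UIKit reuses the event object.
    UIEventType type = event.type;
    dispatch_async(dispatch_get_global_queue(QOS_CLASS_UTILITY, 0), ^{
        // Hypothetical logging hook; replace with your analytics layer.
        NSLog(@"Logging event of type %ld", (long)type);
    });
    [super sendEvent:event];
}

Even then, the per-event bookkeeping still runs on the main thread before the dispatch. Now let's see if Approach 2 is any better.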
Approach 2: Registering our own gesture recognizers in the AppDelegate and taking the data without disturbing the regular flow of events
From the start this seems less intrusive: a couple of lines in the app delegate (sketched below) and, voilà, it should work.
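As a rough preview, and purely a sketch on my part, those lines might attach a recogniser to the window with cancelsTouchesInView set to NO, so the app's own touch handling is untouched (the logTap: name is illustrative):

// In application:didFinishLaunchingWithOptions:
UITapGestureRecognizer *tap =
    [[UITapGestureRecognizer alloc] initWithTarget:self
                                            action:@selector(logTap:)];
tap.cancelsTouchesInView = NO;  // don't swallow the app's own touches
[self.window addGestureRecognizer:tap];

// Called by UIKit alongside the app's own handling.
- (void)logTap:(UITapGestureRecognizer *)recognizer {
    if (recognizer.state == UIGestureRecognizerStateEnded) {
        NSLog(@"Tap at %@",
              NSStringFromCGPoint([recognizer locationInView:self.window]));
    }
}

To coexist with the app's own recognisers, we would likely also need to act as the recogniser's delegate and return YES from gestureRecognizer:shouldRecognizeSimultaneouslyWithGestureRecognizer:.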
I will post the details of Approach 2 in a separate post.