iOS touch delay workaround for Unity, caused by the 3D Touch app switcher. Put the two files in the Assets/Plugins/iOS folder of your Unity project and build as normal.
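The install step can be sketched in shell. The project path and the `.mm` filename are assumptions for illustration; the gist only confirms the header name `OrbGestureFix.h` via its `#import`.

```shell
# Sketch of the install step (paths/names are illustrative, not from the gist):
# copy the two workaround source files into the Unity project's
# Assets/Plugins/iOS folder, creating it if necessary.
# Run from the directory containing the two files.
PROJECT="${PROJECT:-$HOME/MyUnityProject}"   # assumed project location
mkdir -p "$PROJECT/Assets/Plugins/iOS"
cp OrbGestureFix.h OrbGestureFix.mm "$PROJECT/Assets/Plugins/iOS/"
```

After the copy, Unity picks the files up automatically on the next iOS build; no C# changes are needed.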
OrbGestureFix.h:

```objc
#pragma once

#include "UnityAppController.h"
#include "UnityView.h"

/* This is a workaround for touches delayed by about 1s near the left
 * screen edge on iOS devices with 3D Touch.
 *
 * The issue is caused by the global app switcher, which is invoked with
 * a force press near the left screen edge. iOS delays sending touch
 * events to views until that gesture fails to trigger.
 *
 * Apple is unlikely to fix the issue, as the 3D Touch task switcher
 * has been removed in iOS 11.
 *
 * This workaround registers its own gesture recognizer, which is not
 * affected by the touch delay. To simplify matters, all touch input
 * is handled by the recognizer; no touches are sent to the view
 * as long as it is registered.
 *
 * The workaround only activates when needed: on devices that support
 * 3D Touch, and only on iOS 10 and earlier.
 *
 * ("Orb gesture" is apparently Apple's internal name for the 3D Touch
 * task switcher.)
 */

@interface TouchFixGestureRecognizer : UIGestureRecognizer

- (id)initWithTarget:(id)target action:(SEL)action;

@end

@interface TouchFixView : UnityView <UIGestureRecognizerDelegate> {
    TouchFixGestureRecognizer *fix;
}

@end

@interface TouchFixAppController : UnityAppController

@end
```
The implementation file:

```objc
#import "OrbGestureFix.h"
#import <UIKit/UIGestureRecognizerSubclass.h>

#include "UnityAppController.h"

/* The gesture recognizer used to intercept touches.
 *
 * Only by never setting the gesture recognizer's state can we
 * make sure the touches are not delivered to Unity's view but
 * still stay associated with it.
 * If we set the state to began, touches will be disassociated
 * from the view and Unity will stop accepting them.
 * If we set the state to ended or cancelled, the delayed
 * touches will be delivered to Unity's view.
 */
@implementation TouchFixGestureRecognizer

- (id)initWithTarget:(id)target action:(SEL)action
{
    if ((self = [super initWithTarget:target action:action])) {
    }
    return self;
}

- (void)touchesBegan:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesBegan:touches withEvent:event];
    UnitySendTouchesBegin(touches, event);
}

- (void)touchesMoved:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesMoved:touches withEvent:event];
    UnitySendTouchesMoved(touches, event);
}

- (void)touchesEnded:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesEnded:touches withEvent:event];
    UnitySendTouchesEnded(touches, event);
}

- (void)touchesCancelled:(NSSet<UITouch *> *)touches withEvent:(UIEvent *)event
{
    [super touchesCancelled:touches withEvent:event];
    UnitySendTouchesCancelled(touches, event);
}

@end

/* The UnityView subclass that registers our gesture recognizer.
 *
 * We use traitCollectionDidChange: because the trait collection is not yet
 * initialized when the view is created. It also has the benefit that the
 * workaround is enabled/disabled when the user enables/disables 3D Touch.
 */
@implementation TouchFixView

- (void)traitCollectionDidChange:(UITraitCollection *)previousTraitCollection
{
    [super traitCollectionDidChange:previousTraitCollection];

    // The 3D Touch task switcher has been removed in iOS 11,
    // so only activate the workaround on earlier versions.
    if ([[[UIDevice currentDevice] systemVersion] floatValue] < 11) {
        UIForceTouchCapability cap = [[self traitCollection] forceTouchCapability];
        if (cap == UIForceTouchCapabilityAvailable && fix == nil) {
            fix = [[TouchFixGestureRecognizer alloc] initWithTarget:self action:nil];
            [self addGestureRecognizer:fix];
        } else if (cap == UIForceTouchCapabilityUnavailable && fix != nil) {
            [self removeGestureRecognizer:fix];
            fix = nil;
        }
    }
}

@end

/* We override the app controller to use our UnityView subclass. */
@implementation TouchFixAppController

- (UnityView*)createUnityView
{
    return [[TouchFixView alloc] initFromMainScreen];
}

@end

// This tells Unity to use our app controller subclass
IMPL_APP_CONTROLLER_SUBCLASS(TouchFixAppController)
```