ARKit: detecting planes and placing objects
This article covers plane detection in ARKit and placing objects on the detected plane. It is written in Swift 4 with the Xcode 9 beta.
Here is a screenshot of the output of the project explained in this article.
A complete video demo of the code is uploaded here. All the code in this article can be downloaded / cloned from GitHub.
A Little Introduction:
ARKit is the iOS framework for Augmented Reality. ARKit uses the built-in camera, powerful processors, and motion sensors in iOS devices to track real-world objects and let virtual objects blend in with the real-world environment. ARKit uses Visual Inertial Odometry (VIO) to accurately track the world around it. It supports Unity, Unreal, and SceneKit to display AR content.
Before diving into the code, here is a brief description of the objects you need to know.
- ARSession: This class configures and runs various AR techniques on the device. It reads objects through the camera using motion-sensing techniques and provides the session instance for every AR experience built with ARKit.
- SessionConfiguration: An ARSession instance runs a session configuration, which is either ARSessionConfiguration or its subclass ARWorldTrackingSessionConfiguration. This configuration determines how the device's position and motion are tracked in the real world.
- ARSessionConfiguration: It provides the basic configuration that detects only the device's orientation.
- ARWorldTrackingSessionConfiguration: It tracks real-world surfaces and the device's position while reading the device's motion through the camera. It currently provides only horizontal plane / surface detection.
- Views: ARKit provides ARSCNView to display 3D SceneKit content and ARSKView to display 2D SpriteKit content.
- ARAnchor: Every node (either a SceneKit node or a SpriteKit node) is tagged with an ARAnchor object that tracks its real-world position and orientation. ARPlaneAnchor is the subclass of ARAnchor used to track real-world flat surfaces (currently ARKit supports only horizontal surfaces). This object holds the width, length, and center of the plane.
- ARSCNViewDelegate: This protocol provides various methods to receive the captured images and tracking information. It calls the delegate methods, passing an ARAnchor object, whenever a plane is detected, a frame size is updated, a node is deleted, etc. More details are provided as we go through the code.
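The pieces above fit together in just a few lines. Here is a minimal sketch of wiring an ARSCNView to a world-tracking session; the class name `MinimalARViewController` is a placeholder, and the configuration class names follow the iOS 11 beta APIs used throughout this article:

```swift
import UIKit
import ARKit

class MinimalARViewController: UIViewController {
    let sceneView = ARSCNView()

    override func viewDidLoad() {
        super.viewDidLoad()
        sceneView.frame = view.bounds
        view.addSubview(sceneView)
    }

    override func viewWillAppear(_ animated: Bool) {
        super.viewWillAppear(animated)
        // ARWorldTrackingSessionConfiguration tracks the device's position
        // and can detect horizontal planes.
        let configuration = ARWorldTrackingSessionConfiguration()
        configuration.planeDetection = .horizontal
        sceneView.session.run(configuration)
    }

    override func viewWillDisappear(_ animated: Bool) {
        super.viewWillDisappear(animated)
        // Pause the session when the view goes away to save battery.
        sceneView.session.pause()
    }
}
```

The view runs the session; the session runs the configuration. Everything else in this article builds on this skeleton.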
Time to code:
It's certainly easier to understand the objects discussed above when we see the code. Enough of the theory!
Setup:
Open the project code in Xcode. You need at least the Xcode 9 beta (the latest at the time of writing this article) to run the project successfully. Please note that ARKit does not work on iOS simulators; it runs only on iOS devices with an A9 or higher processing chip.
override func viewDidLoad() {
    super.viewDidLoad()
    // Set the view's delegate
    sceneView.delegate = self
    // Show statistics such as fps and timing information
    sceneView.showsStatistics = true
    // Set the scene to the view
    sceneView.scene = scene
    sceneView.autoenablesDefaultLighting = true
    sceneView.automaticallyUpdatesLighting = true
    sceneView.debugOptions = [.showConstraints, .showLightExtents, ARSCNDebugOptions.showFeaturePoints, ARSCNDebugOptions.showWorldOrigin]
    setUpScenesAndNodes()
}
Set up the ARSCNView just like an SCNView in SceneKit, and attach an SCNScene instance to it.
The ARSCNDebugOptions showFeaturePoints and showWorldOrigin in the code above display the feature points ARKit detects on real-world surfaces. The more feature points there are, the easier it is for ARKit to identify horizontal planes.
Configuration:
var configuration = ARWorldTrackingSessionConfiguration()

func setSessionConfiguration(pd: ARWorldTrackingSessionConfiguration.PlaneDetection, runOptions: ARSession.RunOptions) {
    // Currently the only planeDetection option available is .horizontal.
    configuration.planeDetection = pd
    sceneView.session.run(configuration, options: runOptions)
}
Create an instance of ARWorldTrackingSessionConfiguration and set its planeDetection to ARWorldTrackingSessionConfiguration.PlaneDetection.horizontal. Then run the sceneView session with this configuration and the ARSession.RunOptions.resetTracking option.
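ARSession.RunOptions is an option set, so several options can be combined when re-running a session. A sketch of a full reset, assuming the sceneView property from the setup code:

```swift
// Restart tracking from scratch (a sketch).
// .resetTracking restarts device position tracking;
// .removeExistingAnchors also discards all anchors found so far.
func resetSession() {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration,
                          options: [.resetTracking, .removeExistingAnchors])
}
```

This is useful after a tracking interruption, when previously detected planes may no longer line up with the real world.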
Plane Detection:
As you point the camera at a horizontal surface, it shows the surface feature points while trying to identify the plane. Once it has enough feature points, it recognizes the horizontal surface area and calls the method below.
- (nullable SCNNode *)renderer:(id<SCNSceneRenderer>)renderer nodeForAnchor:(ARAnchor *)anchor;
The implementation of this method is provided below.
/* Implement this to provide a custom node for the given anchor.
 @discussion This node will automatically be added to the scene graph.
 If this method is not implemented, a node will be automatically created.
 If nil is returned the anchor will be ignored.
 @param renderer The renderer that will render the scene.
 @param anchor The added anchor.
 @return Node that will be mapped to the anchor or nil.
 */
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    var node: SCNNode?
    if let planeAnchor = anchor as? ARPlaneAnchor {
        node = SCNNode()
        // let planeGeometry = SCNPlane(width: CGFloat(planeAnchor.extent.x), height: CGFloat(planeAnchor.extent.z))
        let planeGeometry = SCNBox(width: CGFloat(planeAnchor.extent.x), height: planeHeight, length: CGFloat(planeAnchor.extent.z), chamferRadius: 0.0)
        planeGeometry.firstMaterial?.diffuse.contents = UIColor.green
        planeGeometry.firstMaterial?.specular.contents = UIColor.white
        let planeNode = SCNNode(geometry: planeGeometry)
        planeNode.position = SCNVector3Make(planeAnchor.center.x, Float(planeHeight / 2), planeAnchor.center.z)
        // Since SCNPlane is vertical, it needs to be rotated -90 degrees on the x-axis to lie flat:
        // planeNode.transform = SCNMatrix4MakeRotation(Float(-CGFloat.pi/2), 1, 0, 0)
        node?.addChildNode(planeNode)
        anchors.append(planeAnchor)
    } else {
        // Haven't encountered this scenario yet.
        print("not plane anchor \(anchor)")
    }
    return node
}
The ARAnchor object holds all the coordinates necessary to create an SCNNode and tie the real-world anchor coordinates to the virtual object created in SceneKit. First, cast the ARAnchor to determine whether it is an ARPlaneAnchor. If the cast succeeds, use the resulting planeAnchor object: planeAnchor.extent and planeAnchor.center give the width, length, and center of the plane. These coordinates can be used to create an SCNFloor, SCNPlane, or any SCNNode that you want to act as a horizontal base. In the code example, we created an SCNBox geometry node. Store this anchor as the current anchor for any future interaction with it.
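The commented-out SCNPlane line hints at an alternative. Since SCNPlane is created vertically, it must be rotated -90 degrees about the x-axis to lie flat on the detected surface. A sketch of that variant (the helper name `planeNode(for:)` is ours, not an ARKit API):

```swift
// Alternative to the SCNBox: a flat, semi-transparent SCNPlane overlay.
func planeNode(for planeAnchor: ARPlaneAnchor) -> SCNNode {
    let planeGeometry = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                                 height: CGFloat(planeAnchor.extent.z))
    planeGeometry.firstMaterial?.diffuse.contents = UIColor.green.withAlphaComponent(0.5)
    let node = SCNNode(geometry: planeGeometry)
    // Place the node at the anchor's center, then rotate it flat.
    node.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)
    node.eulerAngles.x = -.pi / 2
    return node
}
```

An SCNPlane has no thickness, which makes it a good visual overlay; the SCNBox in the article is handier when you want the plane itself to have a visible height.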
Plane Update:
As more feature points are identified for the ARPlaneAnchor, ARKit sends updated anchor coordinates to the method below. It's up to us whether we are interested in the updated frame of the horizontal surface or not.
- (void)renderer:(id<SCNSceneRenderer>)renderer didUpdateNode:(SCNNode *)node forAnchor:(ARAnchor *)anchor;
The implementation of the update method is provided below.
// Called when a node has been updated with data from the given anchor
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor {
        if anchors.contains(planeAnchor) {
            if let planeNode = node.childNodes.first {
                planeNode.position = SCNVector3Make(planeAnchor.center.x, Float(planeHeight / 2), planeAnchor.center.z)
                if let plane = planeNode.geometry as? SCNBox {
                    plane.width = CGFloat(planeAnchor.extent.x)
                    plane.length = CGFloat(planeAnchor.extent.z)
                    plane.height = planeHeight
                }
            }
        }
    }
}
At any moment, we can get the SCNNode tied to a specific ARAnchor, and vice versa, using the method calls below.
/**
 Searches the scene hierarchy for an anchor associated with the provided node.
 @param node A node in the view's scene.
 */
open func anchor(for node: SCNNode) -> ARAnchor?

/**
 Returns the node that has been mapped to a specific anchor.
 @param anchor An anchor with an existing node mapping.
 */
open func node(for anchor: ARAnchor) -> SCNNode?
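ARKit can also remove anchors, for example when two detected planes turn out to be the same surface and get merged. The delegate's didRemove callback is the place to clean up; a sketch, assuming the `anchors` array used throughout this article:

```swift
// Called when a node has been removed along with its anchor.
func renderer(_ renderer: SCNSceneRenderer, didRemove node: SCNNode, for anchor: ARAnchor) {
    if let planeAnchor = anchor as? ARPlaneAnchor,
       let index = anchors.index(of: planeAnchor) {
        // Drop our bookkeeping entry; ARKit removes the node itself.
        anchors.remove(at: index)
    }
}
```

Without this cleanup, the anchors array can accumulate stale entries that no longer map to any node.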
ARHitTestResult:
Any time we want the ARAnchor that the user touched, we can do a hit test on the scene view at the touched location. This gives an array of ARHitTestResult objects. Get the first result from the array and read the identifier property of its ARPlaneAnchor object. This identifier property uniquely identifies every anchor.
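That paragraph condenses to a small helper; the function name `anchorIdentifier(at:)` is ours, everything else is the ARKit API:

```swift
// Returns the identifier of the plane anchor under a touch point, if any.
func anchorIdentifier(at location: CGPoint) -> UUID? {
    let hitResults = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
    guard let planeAnchor = hitResults.first?.anchor as? ARPlaneAnchor else {
        return nil
    }
    // Each ARAnchor carries a UUID that uniquely identifies it.
    return planeAnchor.identifier
}
```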
Place the object:
override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
    guard let touch = touches.first else { return }
    let location = touch.location(in: sceneView)
    if arState == .select {
        selectExistinPlane(location: location)
    }
    if arState == .reset && anchors.count > 0 {
        let hitResults = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
        if let result = hitResults.first {
            let newLocation = SCNVector3Make(result.worldTransform.columns.3.x, result.worldTransform.columns.3.y, result.worldTransform.columns.3.z)
            if let newLampNode = lampNode?.clone() {
                newLampNode.position = newLocation
                sceneView.scene.rootNode.addChildNode(newLampNode)
            }
        }
    }
}
func selectExistinPlane(location: CGPoint) {
    let hitResults = sceneView.hitTest(location, types: .existingPlaneUsingExtent)
    if let result = hitResults.first, let planeAnchor = result.anchor as? ARPlaneAnchor {
        // Remove every detected plane except the one that was touched.
        for anchor in anchors where anchor.identifier != planeAnchor.identifier {
            sceneView.node(for: anchor)?.removeFromParentNode()
        }
        anchors = [planeAnchor]
        setPlaneTexture(node: sceneView.node(for: anchors[0])!)
    }
}
After you get the first object in the ARHitTestResult array, its worldTransform column values provide the real-world coordinates of the touched location. Just create an SCNNode of your geometry, set its position to those column values, and add that node to the scene view. You are done!
How exciting ARKit is! For any questions or comments, let me know. Happy coding!
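One last practical note: sessions get interrupted (a phone call, backgrounding the app) and can fail outright. ARSCNViewDelegate inherits the session callbacks from ARSessionObserver, so the same delegate can react. A sketch, reusing the configuration class named in this article:

```swift
// Surface session failures instead of silently freezing the view.
func session(_ session: ARSession, didFailWithError error: Error) {
    print("AR session failed: \(error.localizedDescription)")
}

// The camera became unavailable, e.g. the app went to the background.
func sessionWasInterrupted(_ session: ARSession) {
    print("Session interrupted")
}

// Tracking may be degraded after an interruption; reset it.
func sessionInterruptionEnded(_ session: ARSession) {
    let configuration = ARWorldTrackingSessionConfiguration()
    configuration.planeDetection = .horizontal
    sceneView.session.run(configuration, options: .resetTracking)
}
```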

Passionate about learning new things. Loves coding and problem solving. Built apps from scratch on the iOS platform with Swift, Objective-C, HTML5, JavaScript, Cordova, and Xamarin, using both MVC and MVVM. Coded extensively in .NET and database technologies before moving to mobile development.
Spends free time playing with my kid and watching TV.

