Building an Augmented Reality Ruler with ARKit

Posted by Aaron John on August 16, 2018

In this project I built an Augmented Reality Ruler that relies on the built-in sensors and the A9 (or newer) processor, available on the iPhone 6S and upwards. The measurements are based on ARKit's plane-detection and world-tracking capabilities.

Here’s how I built the iOS app:

  1. First, import the necessary libraries: UIKit, ARKit, SceneKit, and AVFoundation (the last is needed for the flashlight toggle in step 8). ARKit handles all the tracking and analysis (it tracks the phone's position in the real world), while SceneKit renders 3D virtual objects on top of the camera image. If you have used Unity or Unreal Engine before, you could use one of those as your rendering engine instead.
    
    import UIKit
    import ARKit
    import SceneKit
    import AVFoundation   // needed for AVCaptureDevice (flashlight toggle in step 8)
    
  2. I used ARSCNViewDelegate to connect SceneKit to the augmented reality features of ARKit. For the view I chose ARSCNView rather than a plain SCNView, because ARSCNView renders the SceneKit scene on top of the live camera feed and manages the AR session for it. In viewWillAppear I run the session with an ARWorldTrackingConfiguration, which tracks the device's movement, and in viewWillDisappear I pause it. (A short sketch of enabling plane detection on this configuration follows the code below.)
    
    class ViewController: UIViewController, ARSCNViewDelegate {
    
     @IBOutlet var indicator: UIImageView!       //Outlets that are connected to StoryBoard
     @IBOutlet var placeButton: UIButton!
     @IBOutlet var trashButton: UIButton!
     @IBOutlet var sceneView: ARSCNView!
    
     var center : CGPoint!
     let arrow = SCNNode()    // indicator node that follows where the camera is pointing (its geometry setup is not shown in this listing)
    
     override func viewDidLoad() {
         super.viewDidLoad()
         sceneView.delegate = self
         center = view.center
         sceneView.scene.rootNode.addChildNode(arrow)
         sceneView.autoenablesDefaultLighting = true
     }
    
     override func didRotate(from fromInterfaceOrientation: UIInterfaceOrientation) {
         center = view.center
     }
    
     override func viewWillAppear(_ animated: Bool) {
         super.viewWillAppear(animated)
         let configuration = ARWorldTrackingConfiguration()
         sceneView.session.run(configuration)
     }
    
     override func viewWillDisappear(_ animated: Bool) {
         super.viewWillDisappear(animated)
         sceneView.session.pause()
     }
    
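     The introduction mentions ARKit's plane detection, but the configuration above runs with world tracking only. If you also want ARKit to report flat surfaces as anchors, you can opt in when running the session. A minimal sketch, using the standard planeDetection option:

     let configuration = ARWorldTrackingConfiguration()
     configuration.planeDetection = .horizontal   // also detect horizontal surfaces such as tables and floors
     sceneView.session.run(configuration)
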
  3. The next important part of the project is the SCNNode class. Everything placed inside the scene is represented as an instance of SCNNode, organized in a tree that starts at the scene's root node. Each node can have a geometry (a 3D model), a position, child nodes, and so on. The core concept is that we describe objects as nodes, which is the data structure SceneKit uses to manage all of the content in the scene. (A small sketch of the node tree follows the declarations below.)
    
     var startNode: SCNNode!         //variables for setting up Nodes
     var endNode: SCNNode!
     var lineNode: SCNNode?
     var textNode: SCNNode!
     var textWrapNode: SCNNode!
     var positions = [SCNVector3]()
     var isFirstPoint = true
     var points = [SCNNode]()
    
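     As a quick illustration of the node concept (the spheres here are just example content, not part of the ruler code above), a node bundles a geometry and a position, and child nodes are placed relative to their parent:

     // a red sphere positioned half a meter in front of the world origin
     let sphere = SCNSphere(radius: 0.01)
     sphere.firstMaterial?.diffuse.contents = UIColor.red

     let sphereNode = SCNNode(geometry: sphere)
     sphereNode.position = SCNVector3Make(0, 0, -0.5)

     // nodes form a tree: children inherit their parent's transform
     let childNode = SCNNode(geometry: SCNSphere(radius: 0.005))
     childNode.position = SCNVector3Make(0, 0.05, 0)   // 5 cm above its parent
     sphereNode.addChildNode(childNode)

     sceneView.scene.rootNode.addChildNode(sphereNode)
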
  4. The renderer delegate methods are where ARKit and SceneKit work together. (An "anchor" is a real-world position and orientation that ARKit detects, for example on a wall, floor, chair, or table, which can be used to place a virtual object so that it seems to actually be in the real world. When ARKit finds an anchor, it adds a node for it to the scene and SceneKit calls back into the delegate so you can attach your own content to that node; a sketch of that callback follows the code below.) In this app I use the renderer(_:updateAtTime:) variant, which is called once per frame. On every frame I hit-test from the center of the screen against the feature points ARKit has detected, convert the resulting world transform into a position, and move the arrow indicator to the average of the last ten positions to smooth out jitter.
    
    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
         // hit-test from the screen center against the feature points ARKit has detected
         let hitTest = sceneView.hitTest(center, types: .featurePoint)
         let result = hitTest.last
         guard let transform = result?.worldTransform else {return}

         // the fourth column of the world transform holds the position
         let thirdColumn = transform.columns.3
         let position = SCNVector3Make(thirdColumn.x, thirdColumn.y, thirdColumn.z)
         positions.append(position)

         // average the last ten positions to smooth out jitter in the indicator
         let lastTenPositions = positions.suffix(10)
         arrow.position = getAveragePosition(from: lastTenPositions)
     }
    
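     For completeness, here is a minimal sketch of the anchor callback described above: when ARKit adds an anchor to the session (for example a detected plane, if plane detection is enabled as in the sketch from step 2), SceneKit creates a node for it and calls renderer(_:didAdd:for:) so you can attach your own content. The semi-transparent overlay is just an illustration, not part of the ruler app:

     func renderer(_ renderer: SCNSceneRenderer, didAdd node: SCNNode, for anchor: ARAnchor) {
         // only visualize plane anchors
         guard let planeAnchor = anchor as? ARPlaneAnchor else { return }

         // draw the detected plane as a semi-transparent overlay
         let plane = SCNPlane(width: CGFloat(planeAnchor.extent.x),
                              height: CGFloat(planeAnchor.extent.z))
         plane.firstMaterial?.diffuse.contents = UIColor.white.withAlphaComponent(0.3)

         let planeNode = SCNNode(geometry: plane)
         planeNode.position = SCNVector3Make(planeAnchor.center.x, 0, planeAnchor.center.z)
         planeNode.eulerAngles.x = -.pi / 2   // SCNPlane is vertical by default, so lay it flat
         node.addChildNode(planeNode)
     }
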
  5. Next, I wrote a helper function that averages a slice of positions, giving a stable estimate of the point the camera is aimed at.
    
      func getAveragePosition(from positions : ArraySlice<SCNVector3>) -> SCNVector3 {
         var averageX : Float = 0
         var averageY : Float = 0
         var averageZ : Float = 0
    
         for position in positions {
             averageX += position.x
             averageY += position.y
             averageZ += position.z
         }
         let count = Float(positions.count)
         return SCNVector3Make(averageX / count , averageY / count, averageZ / count)
     }
    
  6. Next, I added buttons for the user interface. I started with a “plus” button to take a measurement. When the user taps this button, the measurement process starts by adding a small white sphere to the scene (in 3D space) at the current indicator position, point A. The user then moves the camera to the end position and taps the “plus” button again, which adds a second small white sphere, point B. A line is then drawn between the two spheres, a red sphere is placed at the midpoint of that line, and the distance between the two points is shown in cm above the midpoint. (Note: this function uses the lined extension from step 9.)
    
    @IBAction func placeAction(_ sender: UIButton) {
    
         let sphereGeometry = SCNSphere(radius: 0.005)
         let sphereNode = SCNNode(geometry: sphereGeometry)
         sphereNode.position = arrow.position
         sceneView.scene.rootNode.addChildNode(sphereNode)
         points.append(sphereNode)
    
         if isFirstPoint {
             isFirstPoint = false
         } else {
             //calculate the distance
             let pointA = points[points.count - 2]
             guard let pointB = points.last else {return}
    
             let d = distance(float3(pointA.position), float3(pointB.position))
    
          // add a line connecting the two spheres
          let line = SCNGeometry.lined(from: pointA.position, to: pointB.position)
          print(d.description)          // log the measured distance (in meters)
          let lineNode = SCNNode(geometry: line)
          sceneView.scene.rootNode.addChildNode(lineNode)
    
    
             // add midPoint
             let midPoint = (float3(pointA.position) + float3(pointB.position)) / 2
             let midPointGeometry = SCNSphere(radius: 0.003)
             midPointGeometry.firstMaterial?.diffuse.contents = UIColor.red
             let midPointNode = SCNNode(geometry: midPointGeometry)
             midPointNode.position = SCNVector3Make(midPoint.x, midPoint.y, midPoint.z)
             sceneView.scene.rootNode.addChildNode(midPointNode)
    
             // add text
    
             let textGeometry = SCNText(string: String(format: "%.0f", d * 100) + "cm" , extrusionDepth: 1)
             let textNode = SCNNode(geometry: textGeometry)
             textNode.scale = SCNVector3Make(0.005, 0.005, 0.01)
             textGeometry.flatness = 0.2
             midPointNode.addChildNode(textNode)
    
    
          // billboard constraint keeps the distance label facing the camera
          let constraints = SCNBillboardConstraint()
          constraints.freeAxes = .all
          midPointNode.constraints = [constraints]

          isFirstPoint = true
    
         }
    
     }
    
  7. Next, I added a button that deletes all the AR nodes created on screen, in case the user wants to clear the scene view. (A note on one side effect of this follows the code below.)
    
     @IBAction func deleteAction(_ sender: UIButton) {
    
         sceneView.scene.rootNode.enumerateChildNodes { (node, stop) in
             node.removeFromParentNode()
         }
    
     }
    
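     Note that enumerating the root node's children like this also removes the arrow indicator that was added in viewDidLoad, so the on-screen pointer disappears until the app is restarted. One possible tweak (my own suggestion, not part of the original code) is to skip that node and reset the measurement state:

     @IBAction func deleteAction(_ sender: UIButton) {
         sceneView.scene.rootNode.enumerateChildNodes { (node, _) in
             // keep the arrow indicator so measuring still works afterwards
             if node !== arrow {
                 node.removeFromParentNode()
             }
         }
         points.removeAll()
         isFirstPoint = true
     }
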
  8. Next, I added a button to toggle a flashlight, so that the user can use this app to take measurements in the dark.
    
    @IBAction func toggleTorch(_ sender: UIButton) {
    
         guard let device = AVCaptureDevice.default(for: AVMediaType.video)
             else {return}
    
         if device.hasTorch {
             do {
                 try device.lockForConfiguration()
    
                 if device.torchMode == .on {
                     device.torchMode = .off
                 } else {
                     device.torchMode = .on
                 }
    
                 device.unlockForConfiguration()
             } catch {
                 print("Torch could not be used")
             }
         } else {
             print("Torch is not available")
         }
     }
    
  9. Finally, I added a class function called lined in an extension on SCNGeometry (it is used in step 6). This function builds the line geometry that connects the point A and point B spheres.
    
    extension SCNGeometry {
     // builds a single line-segment geometry between two given points
     class func lined(from vectorA : SCNVector3, to vectorB : SCNVector3) -> SCNGeometry {
         let indices : [Int32] = [0,1]
         let source = SCNGeometrySource(vertices: [vectorA, vectorB])
         let element = SCNGeometryElement(indices: indices, primitiveType: .line)
         return SCNGeometry(sources: [source], elements: [element])
     }
    }
    
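     One small refinement worth noting (my own suggestion, using standard SceneKit material properties): the .line primitive renders with the default white material, so if you want the measurement line in a different color you can set a material on the geometry before wrapping it in a node:

     let line = SCNGeometry.lined(from: pointA.position, to: pointB.position)
     line.firstMaterial?.diffuse.contents = UIColor.green   // color the measurement line
     let lineNode = SCNNode(geometry: line)
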

You can find the source code for my AR Ruler iOS app on GitHub. In conclusion, the AR Ruler works pretty well, but it is not perfect in some situations, such as low lighting or surfaces that are not entirely flat. The results won't be completely accurate all the time, and since ARKit is still maturing, you're better off using a real ruler for now to measure anything requiring high accuracy.