
How to scan all geometry with texture? #16

Open
masaldana2 opened this issue Apr 20, 2021 · 11 comments

Comments

@masaldana2

The texture only seems to be applied from one frame and to one part of the mesh, not all of it.
i.e.
[screenshot: IMG_98A6385A0CEF-1]

@LLLaiYoung

I also ran into this problem. I scanned a large area, but when the texture was finally displayed, only the part of the mesh inside the current camera frame was textured normally; everything beyond the current frame showed an edge-texture trailing (smearing) effect. I suspect the problem is in the texture coordinate calculation. I checked everything I could find on the Internet, but there was no solution to this problem.

I tried to display the realistic texture map by default, which looks okay while running, but the exported .usdz file has the same edge-texture trailing effect.

My code is shown below:

```swift
func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
    guard scanMode == .noneed else {
        return nil
    }
    guard let anchor = anchor as? ARMeshAnchor else { return nil }

    let node = SCNNode()
    let geometry = scanGeometory(anchor: anchor, node: node)
    node.geometry = geometry

    return node
}

func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard scanMode == .noneed else {
        return
    }
    guard let anchor = anchor as? ARMeshAnchor else { return }
    let geometry = self.scanGeometory(anchor: anchor, node: node)
    node.geometry = geometry
}

func scanGeometory(anchor: ARMeshAnchor, node: SCNNode) -> SCNGeometry {
    // Note: texture coordinates are always computed against the *current*
    // frame, so geometry retextured later in didUpdate gets UVs that no
    // longer match the captured image.
    let frame = sceneView.session.currentFrame!

    let geometry = SCNGeometry(geometry: anchor.geometry,
                               modelMatrix: anchor.transform,
                               textureCoordinates: anchor.geometry.calcTextureCoordinates(camera: frame.camera,
                                                                                          modelMatrix: anchor.transform)!)
    if let image = captureCamera() {
        geometry.firstMaterial?.diffuse.contents = image
    }
    node.geometry = geometry

    return geometry
}

func exportUSDZ() {
    let fileName = "Mesh" + UUID().uuidString
    let documentDirURL = try! FileManager.default.url(for: .documentDirectory,
                                                      in: .userDomainMask,
                                                      appropriateFor: nil,
                                                      create: true)
    let fileURL = documentDirURL.appendingPathComponent(fileName).appendingPathExtension("usdz")

    sceneView.scene.write(to: fileURL, options: nil, delegate: nil, progressHandler: nil)
    let activityVc = UIActivityViewController(activityItems: [fileURL], applicationActivities: nil)

    DispatchQueue.main.async {
        activityVc.popoverPresentationController?.sourceView = UIView(frame: CGRect(x: 100, y: 100, width: 66, height: 88))
        self.present(activityVc, animated: true)
    }
}
```

The effect is as follows:
[screenshot of the trailing texture effect]

@brunsy

brunsy commented Aug 4, 2021

The issue occurs when updating the nodes (from the didUpdate)
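A quick way to confirm this (my own sketch, not from this thread; the early-out test is my assumption) is to stop retexturing a node in `didUpdate` once it has already been given an image texture, so only `nodeFor` applies the camera frame. Based on the `scanGeometory(anchor:node:)` snippet above:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard scanMode == .noneed else { return }
    // If this node was already textured from an earlier frame, leave it
    // alone: regenerating it against the *current* frame is what smears
    // the texture across geometry the camera no longer sees.
    if node.geometry?.firstMaterial?.diffuse.contents is UIImage { return }
    guard let anchor = anchor as? ARMeshAnchor else { return }
    node.geometry = scanGeometory(anchor: anchor, node: node)
}
```

The trade-off is that the mesh geometry also stops updating for textured anchors, so this is diagnostic rather than a fix.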

@LLLaiYoung

> The issue occurs when updating the nodes (from the didUpdate)

Hello, I still don’t know how to solve this problem. Are there any examples?

@edcacoustics

Did anyone solve this? I see what you mean, @brunsy, about it not updating the view when there is a didUpdate, but I am not sure how to refresh the object texture. I can't see the mesh texture being saved at all; it seems to be displayed live only at render time. Can anyone confirm this? Is there a way to save the texture from the camera so it can be recalled?
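One way to keep a texture around (a sketch, not from this repo; the `TextureCache` type and the idea of reusing the first captured image per anchor are my assumptions) is to store the camera image keyed by anchor identifier the first time an anchor is textured, and reuse it on later updates:

```swift
import ARKit
import UIKit

final class TextureCache {
    // Keyed by ARAnchor.identifier; stores the camera image captured
    // the first time each mesh anchor was textured.
    private var images: [UUID: UIImage] = [:]

    func image(for anchor: ARAnchor, capture: () -> UIImage?) -> UIImage? {
        if let cached = images[anchor.identifier] {
            return cached
        }
        guard let fresh = capture() else { return nil }
        images[anchor.identifier] = fresh
        return fresh
    }
}
```

In `didUpdate` you would pass `captureCamera` as the `capture` closure, so each anchor keeps the image from when it was first seen instead of being overwritten by the current frame. Caveat: the cached image only lines up with the mesh if the texture coordinates are also frozen from that same frame.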

@jaswant-iotric

I went through the code, and this isn't an issue with the texture mapping itself; the texture mapping works as expected. It looks like didUpdate also delivers nodes that are not in the current frame, which results in this trailing-edge effect.
One way to solve this would be to find the nodes that are within the frame and apply the texture only to those (I don't know how to do that yet).

Let's open a discussion and get this done. It's been too long that everyone has been trying to find an open-source solution for this!
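SceneKit can do the in-frame test itself: `SCNSceneRenderer.isNode(_:insideFrustumOf:)` checks a node's bounding box against the frustum of a point-of-view node. A sketch of restricting retexturing in `didUpdate` to visible nodes, assuming the `scanGeometory(frame:anchor:node:needTexture:cameraImage:)` and `captureCamera()` helpers used elsewhere in this thread:

```swift
func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
    guard let meshAnchor = anchor as? ARMeshAnchor,
          let frame = sceneView.session.currentFrame,
          let pointOfView = sceneView.pointOfView,
          // Only retexture nodes whose bounding box intersects the camera frustum.
          renderer.isNode(node, insideFrustumOf: pointOfView),
          let cameraImage = captureCamera() else { return }

    node.geometry = scanGeometory(frame: frame, anchor: meshAnchor, node: node,
                                  needTexture: true, cameraImage: cameraImage)
}
```

Note the frustum test is coarse (bounding-box based), and a mesh anchor that is only partially visible still gets texture coordinates that are valid just for the visible part.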

@jaswant-iotric

I went ahead and did some experiments, and this code did a little better. Not usable yet, though. Please have a look:

```swift
//
//  ScanViewController.swift
//  ExampleOfiOSLiDAR
//
//  Created by TokyoYoshida on 2021/02/10.
//

import ARKit
import RealityKit
import SpriteKit

class LabelScene: SKScene {
    let label = SKLabelNode()
    var onTapped: (() -> Void)? = nil

    override public init(size: CGSize) {
        super.init(size: size)

        self.scaleMode = SKSceneScaleMode.resizeFill

        label.fontSize = 65
        label.fontColor = .blue
        label.position = CGPoint(x: frame.midX, y: label.frame.size.height + 50)

        self.addChild(label)
    }

    required init?(coder aDecoder: NSCoder) {
        fatalError("init(coder:) has not been implemented")
    }

    convenience init(size: CGSize, onTapped: @escaping () -> Void) {
        self.init(size: size)
        self.onTapped = onTapped
    }

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        onTapped?()
    }

    func setText(text: String) {
        label.text = text
    }
}

class ScanViewController: UIViewController, ARSCNViewDelegate, ARSessionDelegate {
    enum ScanMode {
        case noneed
        case doing
        case done
    }

    @IBOutlet weak var sceneView: ARSCNView!
    var scanMode: ScanMode = .noneed
    var originalSource: Any? = nil

    let globalTestNode = SCNNode()

    lazy var label = LabelScene(size: sceneView.bounds.size) { [weak self] in
        self?.rotateMode()
    }

    override func viewDidLoad() {
        func setARViewOptions() {
            sceneView.scene = SCNScene()
        }
        func buildConfigure() -> ARWorldTrackingConfiguration {
            let configuration = ARWorldTrackingConfiguration()

            configuration.environmentTexturing = .automatic
            configuration.sceneReconstruction = .mesh
            if type(of: configuration).supportsFrameSemantics(.sceneDepth) {
                configuration.frameSemantics = .smoothedSceneDepth
            }

            return configuration
        }
        func setControls() {
            label.setText(text: "Scan")
            sceneView.overlaySKScene = label
        }
        super.viewDidLoad()
        sceneView.delegate = self
        sceneView.session.delegate = self
        setARViewOptions()
        let configuration = buildConfigure()
        sceneView.session.run(configuration)
        setControls()
    }

    func rotateMode() {
        switch self.scanMode {
        case .noneed:
            self.scanMode = .doing
            label.setText(text: "Reset")
            originalSource = sceneView.scene.background.contents
            sceneView.scene.background.contents = UIColor.black
        case .doing:
            break
        case .done:
            scanAllGeometry(needTexture: false)
            self.scanMode = .noneed
            label.setText(text: "Scan")
            sceneView.scene.background.contents = originalSource
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, nodeFor anchor: ARAnchor) -> SCNNode? {
        guard scanMode == .noneed else {
            return nil
        }
        guard let anchor = anchor as? ARMeshAnchor,
              let frame = sceneView.session.currentFrame else { return nil }

        let node = SCNNode()
        if nodeIsWithinFrame(node: node) {
            guard let cameraImage = captureCamera() else { return node }
            let geometry = self.scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: true, cameraImage: cameraImage)
            node.geometry = geometry
            globalTestNode.addChildNode(node)
        }

        return node
    }

    func renderer(_ renderer: SCNSceneRenderer, didUpdate node: SCNNode, for anchor: ARAnchor) {
        guard scanMode == .noneed else {
            return
        }
        guard let frame = self.sceneView.session.currentFrame else { return }
        guard let anchor = anchor as? ARMeshAnchor else { return }
        if nodeIsWithinFrame(node: node) {
            guard let cameraImage = captureCamera() else { return }
            let geometry = self.scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: true, cameraImage: cameraImage)
            node.geometry = geometry
            globalTestNode.addChildNode(node)
        }
    }

    func renderer(_ renderer: SCNSceneRenderer, updateAtTime time: TimeInterval) {
        if self.scanMode == .doing {
            self.scanAllGeometry(needTexture: true)
            self.scanMode = .done
        }
    }

    // Checks whether the node's projected world position falls on screen.
    // Note: the rect below is offset by half the screen size; using
    // UIScreen.main.bounds directly may be what was intended.
    func nodeIsWithinFrame(node: SCNNode) -> Bool {
        guard sceneView.pointOfView != nil else { return false }
        let position = sceneView.projectPoint(node.worldPosition)
        let bounds = UIScreen.main.bounds
        let mBounds = CGRect(x: bounds.midX, y: bounds.midY, width: bounds.width, height: bounds.height)

        return mBounds.contains(CGPoint(x: CGFloat(position.x), y: bounds.size.height - CGFloat(position.y)))
    }

    func scanGeometory(frame: ARFrame, anchor: ARMeshAnchor, node: SCNNode, needTexture: Bool = false, cameraImage: UIImage? = nil) -> SCNGeometry {
        let camera = frame.camera

        let geometry = SCNGeometry(geometry: anchor.geometry, camera: camera, modelMatrix: anchor.transform, needTexture: needTexture)

        if let image = cameraImage, needTexture {
            geometry.firstMaterial?.diffuse.contents = image
        } else {
            geometry.firstMaterial?.diffuse.contents = UIColor(red: 0.5, green: 1.0, blue: 0.0, alpha: 0.7)
        }
        node.geometry = geometry

        return geometry
    }

    func scanAllGeometry(needTexture: Bool) {
        self.sceneView.scene.rootNode.addChildNode(globalTestNode)
        return // NOTE: early return; the per-anchor retexturing below is disabled in this experiment

        guard let frame = sceneView.session.currentFrame else { return }
        guard let cameraImage = captureCamera() else { return }

        guard let anchors = sceneView.session.currentFrame?.anchors else { return }
        let meshAnchors = anchors.compactMap { $0 as? ARMeshAnchor }
        for anchor in meshAnchors {
            guard let node = sceneView.node(for: anchor) else { continue }
            let geometry = scanGeometory(frame: frame, anchor: anchor, node: node, needTexture: needTexture, cameraImage: cameraImage)
            node.geometry = geometry
        }
    }

    func captureCamera() -> UIImage? {
        guard let frame = sceneView.session.currentFrame else { return nil }

        let pixelBuffer = frame.capturedImage

        let image = CIImage(cvPixelBuffer: pixelBuffer)

        let context = CIContext(options: nil)
        guard let cameraImage = context.createCGImage(image, from: image.extent) else { return nil }

        return UIImage(cgImage: cameraImage)
    }
}

extension UIColor {
    static var random: UIColor {
        return UIColor(
            red: .random(in: 0...1),
            green: .random(in: 0...1),
            blue: .random(in: 0...1),
            alpha: 1.0
        )
    }
}
```

@jaswant-iotric

Also, on Apple's developer forums I found this:

> If you want to color the mesh based on the camera feed, you could do so manually, for example by unprojecting the pixels of the camera image into 3D space and color the according mesh face with the pixel's color. However, keep in mind that ARMeshAnchors are constantly updated. So you might want to first scan the entire area you're interested in, then stop scene reconstruction, and do the coloring in a subsequent step.
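The projection the forum answer describes is roughly what the repo's `calcTextureCoordinates` does per vertex: transform each mesh vertex to world space and project it through the camera. A minimal sketch of that math (the helper name and loop-free structure are mine, not from the repo):

```swift
import ARKit
import simd

// Projects one mesh-local vertex of an ARMeshAnchor into normalized
// image coordinates, usable as a texture coordinate for that frame.
func textureCoordinate(vertexLocal: simd_float3,
                       anchorTransform: simd_float4x4,
                       camera: ARCamera,
                       imageSize: CGSize) -> simd_float2 {
    // Local -> world space (rigid transform, so w stays 1).
    let world4 = anchorTransform * simd_float4(vertexLocal, 1)
    let world = simd_float3(world4.x, world4.y, world4.z)
    // World -> pixel coordinates in the captured image.
    let pt = camera.projectPoint(world,
                                 orientation: .portrait,
                                 viewportSize: imageSize)
    // Pixel -> normalized [0, 1] UV.
    return simd_float2(Float(pt.x / imageSize.width),
                       Float(pt.y / imageSize.height))
}
```

Coordinates computed this way are only valid for the frame they were projected from, which is exactly why anchors retextured later against a different frame show the trailing effect; hence the forum's advice to freeze the mesh first and texture it once in a final pass.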

@WayneChou1

Hi, did you solve this problem? @jaswant-iotric

@farhodyusupov

Hi, has anyone found a solution for this issue?

@Chaman-Sahu-iOS

Hi, are there any updates on this?

@jaswant-iotric

jaswant-iotric commented Aug 7, 2024

I've found a solution. It uses a library called RTABMAP, which provides solutions for Android, iOS, and IoT. I've implemented it in my iOS app here: https://apps.apple.com/in/app/ar-scale/id1671790051
