
I would like to develop a desktop app for macOS that uses the computer's camera with RealityKit to display a 3D cube in front of the camera at some distance.
I've started a macOS project using RealityKit and immediately noticed that it does not use the Mac's camera; it doesn't even ask for camera access permission. The same project works fine on iOS.
On macOS it shows the cube against a black background.

Is the same functionality not available on both macOS and iOS, or am I just doing something wrong?

We expect the app to ask for permission to access the camera and the object to be visualized in front of the camera.
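For context, this is roughly the kind of iOS setup in question (a minimal sketch; the names and values here are illustrative, not the actual project code):

    import SwiftUI
    import RealityKit

    struct ARCubeContainer: UIViewRepresentable {
        func makeUIView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)
            let cube = ModelEntity(mesh: .generateBox(size: 0.2),
                                   materials: [SimpleMaterial(color: .blue, isMetallic: false)])
            // A camera anchor keeps the cube one metre in front of the device camera
            let anchor = AnchorEntity(.camera)
            cube.position.z = -1.0
            anchor.addChild(cube)
            arView.scene.addAnchor(anchor)
            return arView
        }

        func updateUIView(_ uiView: ARView, context: Context) { }
    }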

2 Answers


  1. Unlike Unity AR apps, which can run on desktop computers, RealityKit apps for macOS don't have the ARKit part (there are no accelerometer/gyroscope/FaceID/LiDAR hardware sensors), so no tracking or scene-understanding capabilities are included in RealityKit's macOS API. RealityKit desktop apps are considered VR apps.

    Based on the aforementioned, it is not difficult to guess that there is no ARSession object in the RealityKit API for macOS.

    import SwiftUI
    import RealityKit

    struct ARContainer: NSViewRepresentable {
        let arView = ARView(frame: .zero)

        func makeNSView(context: Context) -> ARView {
            // ARView has no `session` property in the macOS version of RealityKit
            print(arView.session.identifier)              // Error in macOS
            return arView
        }

        func updateNSView(_ nsView: ARView, context: Context) { }
    }
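
    Since the desktop API treats the scene as VR rather than AR, a minimal sketch of the macOS-style alternative (a cube in front of an explicit PerspectiveCamera instead of a camera feed; the names here are hypothetical, not the asker's code) could look like this:

    import SwiftUI
    import RealityKit

    struct CubeContainer: NSViewRepresentable {
        func makeNSView(context: Context) -> ARView {
            let arView = ARView(frame: .zero)

            // Place a cube one metre in front of the world origin
            let cube = ModelEntity(mesh: .generateBox(size: 0.2),
                                   materials: [SimpleMaterial(color: .red, isMetallic: false)])
            let cubeAnchor = AnchorEntity(world: [0, 0, -1])
            cubeAnchor.addChild(cube)
            arView.scene.addAnchor(cubeAnchor)

            // Explicit camera at the origin, looking down -Z (there is no camera-feed background on macOS)
            let camera = PerspectiveCamera()
            let cameraAnchor = AnchorEntity(world: .zero)
            cameraAnchor.addChild(camera)
            arView.scene.addAnchor(cameraAnchor)
            return arView
        }

        func updateNSView(_ nsView: ARView, context: Context) { }
    }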
    
  2. 1. Please check that you have added the camera usage description (the NSCameraUsageDescription key) to your Info.plist file.
    2. Please check whether your camera is properly connected to the desktop computer.
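
    As a sketch of how camera authorization could be verified at runtime before configuring capture (the helper name is hypothetical):

    import AVFoundation

    // Hypothetical helper: confirm camera authorization, requesting it if not yet determined
    func checkCameraAccess(completion: @escaping (Bool) -> Void) {
        switch AVCaptureDevice.authorizationStatus(for: .video) {
        case .authorized:
            completion(true)
        case .notDetermined:
            AVCaptureDevice.requestAccess(for: .video, completionHandler: completion)
        default:
            completion(false)
        }
    }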
