#StackBounty: #augmented-reality #arcore Visualize the 3D feature map associated with ARCore Cloud Anchors

Bounty: 50

I am developing an application using ARCore Cloud Anchors. My question relates to the documentation, which describes how hosting an anchor creates a 3D feature map of the environment, binds it to the anchor’s location, and stores it.

Question: Is it possible to visualize, on the application side, the 3D feature map associated with an ARCore Cloud Anchor? I ask because, if the app knew quantitatively where exactly to scan, it could help ordinary users resolve anchors more smoothly.

The documentation also says to find an area with visual features that are easily distinguishable from each other, but I don’t think that is quantitative enough to derive the best way to scan an area.
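
As far as the public API goes, the hosted feature map itself is not exposed to the client; the closest quantitative signal is the feature-map-quality estimate that newer ARCore SDKs provide for hosting. On iOS, where the Cloud Anchors SDK runs on top of an ARKit session, one rough proxy is to visualize the raw feature points ARKit is currently tracking. A minimal sketch, assuming an ARSCNView-based app (the sphere markers are purely illustrative):

import ARKit
import SceneKit

// Sketch: show the feature points ARKit is tracking right now.
// The built-in overlay is one line:
//     sceneView.debugOptions = [ARSCNDebugOptions.showFeaturePoints]
// For custom rendering, read the current frame's raw point cloud:
func visualizeFeaturePoints(in sceneView: ARSCNView) {
    guard let cloud = sceneView.session.currentFrame?.rawFeaturePoints else { return }
    for point in cloud.points {
        let marker = SCNNode(geometry: SCNSphere(radius: 0.002))
        marker.position = SCNVector3Make(point.x, point.y, point.z)
        sceneView.scene.rootNode.addChildNode(marker)
    }
}

Calling this periodically (and pruning stale markers) gives users visible feedback on which surfaces have already been scanned densely.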


Get this bounty!!!

#StackBounty: #json #swift #xml #augmented-reality #arkit ARKit 4.0 – Is it possible to convert ARWorldMap data to JSON file?

Bounty: 50

I’d like to know whether it is possible to convert worldMap binary data (which stores a space-mapping state and a set of ARAnchors) into a JSON or XML file.

func writeWorldMap(_ worldMap: ARWorldMap, to url: URL) throws {
    // ARWorldMap supports NSSecureCoding, so it can be archived to Data.
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                requiringSecureCoding: true)
    try data.write(to: url)
}

If this is possible, what tools can I use for that?
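
For what it’s worth, the NSKeyedArchiver output is a binary property list, so one route is to re-serialize the archive as an XML plist with PropertyListSerialization (or plutil -convert xml1 on the command line) and to export a human-readable JSON summary of the anchors separately. A sketch, with file names chosen arbitrarily:

import ARKit

// Sketch: write the world-map archive as an XML plist, plus a JSON
// summary of the anchors. The plist round-trips; the JSON is a
// one-way, human-readable export.
func exportWorldMap(_ worldMap: ARWorldMap, to directory: URL) throws {
    let data = try NSKeyedArchiver.archivedData(withRootObject: worldMap,
                                                requiringSecureCoding: true)

    // The keyed archive is a binary plist; rewrite it in XML form.
    let plist = try PropertyListSerialization.propertyList(from: data,
                                                           options: [],
                                                           format: nil)
    let xml = try PropertyListSerialization.data(fromPropertyList: plist,
                                                 format: .xml,
                                                 options: 0)
    try xml.write(to: directory.appendingPathComponent("worldMap.xml"))

    // Export the anchors' names and 4x4 transforms as JSON.
    let anchors: [[String: Any]] = worldMap.anchors.map { anchor in
        let t = anchor.transform
        return [
            "name": anchor.name ?? "",
            "transform": [t.columns.0, t.columns.1, t.columns.2, t.columns.3]
                .flatMap { [$0.x, $0.y, $0.z, $0.w] }
        ]
    }
    let json = try JSONSerialization.data(withJSONObject: ["anchors": anchors],
                                          options: .prettyPrinted)
    try json.write(to: directory.appendingPathComponent("anchors.json"))
}

Note that the JSON side is lossy: ARWorldMap’s raw feature data has no documented JSON schema, so only the archive (binary or XML plist) can be restored with NSKeyedUnarchiver.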


Get this bounty!!!

#StackBounty: #swift #scenekit #augmented-reality #arkit #hittest Place multiple SCN objects in touchesBegan method

Bounty: 50

My Swift code below uses touchesBegan to place an SCN object in the ARKit view. The problem is that it only places the object once. I would like users to be able to tap any area and place the SCN object there, as many times as they want.

Here’s the GitHub link.


class ViewController: UIViewController, ARSCNViewDelegate {

    @IBOutlet var sceneView: ARSCNView!

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        // Handle the shooting
        guard let frame = sceneView.session.currentFrame else { return }
        let camMatrix = SCNMatrix4(frame.camera.transform)
        let direction = SCNVector3Make(-camMatrix.m31 * 5.0, -camMatrix.m32 * 10.0, -camMatrix.m33 * 5.0)
        let position = SCNVector3Make(camMatrix.m41, camMatrix.m42, camMatrix.m43)
        // Note: this replaces the entire scene on every touch, so only one
        // freshly loaded object is ever visible; `direction` and `position`
        // are computed but never used.
        let scene = SCNScene(named: "art.scnassets/dontCare.scn")!
        sceneView.scene = scene
    }
}
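
A likely fix, sketched below rather than taken from the linked project: load the model node once, then add a fresh clone of it to the existing scene’s root node on every tap, instead of assigning a whole new SCNScene (the node name "dontCare" is an assumption about the .scn file’s contents):

import UIKit
import SceneKit
import ARKit

class MultiPlacementViewController: UIViewController {

    @IBOutlet var sceneView: ARSCNView!

    // Load the template node once; clone it for each placement.
    lazy var templateNode: SCNNode? = SCNScene(named: "art.scnassets/dontCare.scn")?
        .rootNode.childNode(withName: "dontCare", recursively: true)

    override func touchesBegan(_ touches: Set<UITouch>, with event: UIEvent?) {
        guard let frame = sceneView.session.currentFrame,
              let node = templateNode?.clone() else { return }
        // Place the clone at the current camera position; adding it as a
        // child keeps every previously placed clone in the scene.
        let camMatrix = SCNMatrix4(frame.camera.transform)
        node.position = SCNVector3Make(camMatrix.m41, camMatrix.m42, camMatrix.m43)
        sceneView.scene.rootNode.addChildNode(node)
    }
}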


Get this bounty!!!

#StackBounty: #android #ios #game-development #artificial-intelligence #augmented-reality Library for creating an augmented reality app…

Bounty: 100

I’m trying to build an application similar to Wanna Kicks. I’m looking for a recommendation for an AR library that will enable me to display 3D objects on top of human legs.

I expect the library I’m looking for to let me train the application on predefined images/models of the human leg, detect the angle/rotation and position of the leg, and place the 3D object on top of the leg image.

I have worked with SDKs like Vuforia, ARCore, and ARKit before, and they don’t provide the ability to track this kind of target. They can only detect markers and static 2D and 3D objects with fixed shapes and features.

I’m also aware that I most likely won’t find a single library that satisfies every requirement. I’m therefore also open to options that do the heavy lifting for me (training, detection) while letting me add the missing pieces myself (detecting position & rotation).

What are my options, whether an augmented reality library or a non-AR library that can still do the job?

I’m looking for a library that runs on Android, iOS, or both.


Get this bounty!!!

#StackBounty: #ios #swift #augmented-reality #arkit #qlpreviewcontroller iOS: Issue with lighting in USDZ file

Bounty: 300

I am trying to show AR content with QLPreviewController. Everything works fine except the lighting. If I preview the file using Xcode or macOS’s Quick Look, the lighting is natural, but when I preview it using QLPreviewController the object is too dark!
Is there any possible way to adjust the lighting, scale, and other settings?
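
QLPreviewController exposes very few knobs for AR content: lighting comes from the image-based light AR Quick Look applies on its own, so the practical fixes are usually on the asset side (e.g. brighter or emissive materials baked into the USDZ). Scaling, at least, can be controlled on iOS 13+ through ARQuickLookPreviewItem. A minimal sketch, assuming a bundled model.usdz (a hypothetical asset name):

import UIKit
import QuickLook
import ARKit

class USDZPreviewController: UIViewController, QLPreviewControllerDataSource {

    // Hypothetical bundled asset.
    let modelURL = Bundle.main.url(forResource: "model", withExtension: "usdz")!

    func showPreview() {
        let preview = QLPreviewController()
        preview.dataSource = self
        present(preview, animated: true)
    }

    func numberOfPreviewItems(in controller: QLPreviewController) -> Int { 1 }

    func previewController(_ controller: QLPreviewController,
                           previewItemAt index: Int) -> QLPreviewItem {
        // ARQuickLookPreviewItem (iOS 13+) is the supported hook for tweaking
        // presentation; it can lock the model's real-world scale, but it
        // offers no lighting controls.
        let item = ARQuickLookPreviewItem(fileAt: modelURL)
        item.allowsContentScaling = false
        return item
    }
}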


Get this bounty!!!

#StackBounty: #ios #augmented-reality #arkit #compass-geolocation Rotate SCNNode to North without permission for location tracking

Bounty: 100

We would like to rotate an SCNNode so that it points toward North.

ARKit can align its z-axis with North (see the gravityAndHeading world alignment), but this requires user permission for location tracking, and we don’t want to ask for it.

As a consequence, we’re trying to use the magneticField property of CMDeviceMotion, but we have no idea how to do that.
There are probably some matrix calculations involved, and we don’t master them for the time being.

Any help would be highly appreciated, thanks!
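
One permission-free approach, sketched here as a starting point rather than a tested solution: instead of working with the raw magneticField vector directly, ask Core Motion for device attitude relative to magnetic north via the .xMagneticNorthZUp reference frame. That frame uses the magnetometer but no location services; it is .xTrueNorthZUp that requires Core Location. The attitude’s yaw can then drive the node’s rotation:

import SceneKit
import CoreMotion

let motionManager = CMMotionManager()

// Sketch: rotate a node toward magnetic north using Core Motion only.
func alignToMagneticNorth(_ node: SCNNode) {
    guard CMMotionManager.availableAttitudeReferenceFrames()
            .contains(.xMagneticNorthZUp) else { return }
    motionManager.startDeviceMotionUpdates(using: .xMagneticNorthZUp,
                                           to: .main) { motion, _ in
        guard let motion = motion else { return }
        // Yaw is the rotation about gravity, measured from magnetic north.
        node.eulerAngles.y = Float(motion.attitude.yaw)
        motionManager.stopDeviceMotionUpdates()
    }
}

The exact sign and offset depend on the session’s world alignment and the device orientation, so the yaw mapping will likely need calibrating against a known heading. Also note that magnetic north differs from true north by the local declination, which normally comes from location data.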


Get this bounty!!!

#StackBounty: #touch #virtual-reality #augmented-reality #calibration Mapping a Touchinput-device in a 3D World for VR

Bounty: 100

I have a touch-input device that is linked to my Unity program and a VR device.
The touch-input device is bolted in place, but its position can be changed between sessions.

I can get the 2D coordinates X/Y from the touch-input device (each ranging from 0 to 1).

For each of those 2D coordinates I also have a 3D VR coordinate, captured with a Leap Motion or gloves.
(Those sample points are not always the corners.)

Here is the problem, illustrated in the original post with a quick Paint sketch.

So what I have are the 2D and 3D positions of my fingertip (on the touch surface and inside the 3D world).

And what I need is a 3D object (for example, a cube) with the correct rotation and position.

Bonus points for Scale!
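
This reduces to fitting a plane through known correspondences: find an origin O and axis vectors U, V such that each sampled fingertip satisfies P ≈ O + u*U + v*V. Three non-collinear samples solve it exactly; more samples call for least squares. Here is a sketch of the three-sample case in Swift with simd (the project itself is Unity, so read this as portable pseudocode for the math):

import simd

struct TouchSurface {
    let origin: simd_float3   // 3D position of touch coordinate (0, 0)
    let uAxis: simd_float3    // direction/extent of the u axis (length = scale)
    let vAxis: simd_float3    // direction/extent of the v axis
    var normal: simd_float3 { simd_normalize(simd_cross(uAxis, vAxis)) }
}

// Solve P_i = O + u_i * U + v_i * V for (U, V, O) from three samples.
func fitSurface(uv: [simd_float2], points: [simd_float3]) -> TouchSurface? {
    guard uv.count == 3, points.count == 3 else { return nil }
    // Each world-coordinate axis gives a 3x3 linear system
    // with rows (u_i, v_i, 1).
    let m = simd_float3x3(rows: [
        simd_float3(uv[0].x, uv[0].y, 1),
        simd_float3(uv[1].x, uv[1].y, 1),
        simd_float3(uv[2].x, uv[2].y, 1),
    ])
    let inv = m.inverse   // yields NaNs if the samples are collinear
    let x = inv * simd_float3(points[0].x, points[1].x, points[2].x)
    let y = inv * simd_float3(points[0].y, points[1].y, points[2].y)
    let z = inv * simd_float3(points[0].z, points[1].z, points[2].z)
    // Each solution vector holds (U, V, O) for that coordinate.
    return TouchSurface(origin: simd_float3(x.z, y.z, z.z),
                        uAxis:  simd_float3(x.x, y.x, z.x),
                        vAxis:  simd_float3(x.y, y.y, z.y))
}

The cube’s position is then origin + 0.5 * uAxis + 0.5 * vAxis, its rotation comes from the orthonormal basis (normalize(uAxis), normal, normalize(vAxis)), and the bonus scale falls out as the axis lengths |uAxis| and |vAxis|.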


Get this bounty!!!