⭐️⭐️⭐️⭐️⭐️ AR Foundation Remote 2.0 – Published By Kyrylo Kuzyk


Short Description:

AR Foundation Remote 2.0 is a big update to AR Foundation Editor…

Rating: ⭐️⭐️⭐️⭐️⭐️

Discount: None

Asset Title: AR Foundation Remote 2.0

Publisher: Kyrylo Kuzyk

Category: tools, utilities

More Details about this asset:

AR Foundation Remote 2.0 is a big update to AR Foundation Editor Remote: the most popular debugging tool for AR apps. With new exclusive features, you’ll be able to iterate even faster and deliver high-quality AR apps to the market with greater confidence.

In simple words: AR Foundation Remote 2.0 = Unity Remote + AR Foundation + Input System (New) + So Much More.

💡 Current workflow with AR Foundation 💡

1. Make a change to your AR project.

2. Build project to a real AR device.

3. Wait for the build to complete.

4. Wait a little bit more.

5. Test your app on a real device using only Debug.Log().

🔥 Improved workflow with AR Foundation Remote 2.0 🔥

1. Set up the AR Companion app once. The setup takes only a few minutes.

2. Just press play! Run and debug your AR app with full access to scene hierarchy and all object properties right in the Editor!

⚡ Features ⚡

• Precisely replicates the behavior of a real AR device in Editor.

• Extensively tested with ARKit and ARCore.

• Plug-and-play: no additional scene setup is needed; just run your AR scene in the Editor with the AR Companion app running. Extensively tested with scenes from the AR Foundation Samples repository.

• Streams video from Editor to real AR device so you can see how your app looks on it without making a build (see Limitations).

• Multi-touch input remoting: stream multi-touch from an AR device or simulate touch using a mouse in the Editor (see Limitations).

• Test Location Services (GPS), Gyroscope, and Compass right in the Editor.

• Written in pure C# with no third-party libraries or native code. Adds no performance overhead in production. Full source code is available.

• Connect any AR Device to Windows PC or macOS via Wi-Fi: iOS + Windows PC, Android + macOS… any variation you can imagine!

• Compatible with Wikitude SDK Expert Edition.
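Because the GPS and compass values streamed to the Editor arrive through Unity's standard APIs, ordinary location code needs no changes. A minimal sketch, assuming the usual `Input.location` and `Input.compass` APIs (the component name is illustrative):

```csharp
using System.Collections;
using UnityEngine;

// Minimal sketch: polls Location Services and the compass the standard
// Unity way; with remoting active, the same code receives real device
// values while running in the Editor.
public class LocationProbe : MonoBehaviour
{
    IEnumerator Start()
    {
        Input.compass.enabled = true;    // start compass updates
        Input.location.Start(10f, 10f);  // desired accuracy / update distance in meters

        // Wait until Location Services finishes initializing (or fails).
        while (Input.location.status == LocationServiceStatus.Initializing)
            yield return new WaitForSeconds(1f);

        if (Input.location.status == LocationServiceStatus.Running)
        {
            LocationInfo loc = Input.location.lastData;
            Debug.Log($"Lat {loc.latitude}, Lon {loc.longitude}, Heading {Input.compass.trueHeading}");
        }
        else
        {
            Debug.LogWarning("Location Services unavailable or permission denied.");
        }
    }
}
```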

🎥 Session Recording and Playback 🎥

The Session Recording and Playback feature lets you record AR sessions to a file and play them back in a reproducible environment (see Limitations).

• Record and playback all supported features: face tracking, image tracking, plane tracking, touch input, you name it!

• Fix bugs that occur only under specific conditions. Replaying a previously recorded AR session in a reproducible environment helps you track down and fix bugs even faster!

• Record testing scenarios for your AR app. Your testers don’t have to fight over testing devices ever again: record a testing scenario once, then play it back as many times as you want without an AR device.

⚓️ ARCore Cloud Anchors ⚓️

Testing ARCore Cloud Anchors doesn't have to be hard. With a custom fork adapted to work with AR Foundation Remote 2.0, you can run AR projects that use ARCore Cloud Anchors right in the Unity Editor.

• Host Cloud Anchors.

• Resolve Cloud Anchors.

• Record an AR session with ARCore Cloud Anchors and play it back in a reproducible environment.

🕹 Input System (New) support 🕹

Version 2.0 brings Input System (New) support with all the benefits of input events and enhanced touch functionality.

• Input Remoting allows you to transmit all input events from your AR device back to the Editor. Test Input System multi-touch input right in the Editor!

• Test Input Actions right in the Editor without making builds.

• Record all Input System events to a file and play them back in a reproducible environment. Again, all supported features can be recorded and played back!
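As a sketch of what input remoting enables: multi-touch can be read through the Input System's standard enhanced touch API, and with remoting active the touches arriving in the Editor originate on the device. A minimal example, assuming Unity's `EnhancedTouch` API (the component name is illustrative):

```csharp
using UnityEngine;
using UnityEngine.InputSystem.EnhancedTouch;
// Alias to avoid clashing with the legacy UnityEngine.Touch type.
using Touch = UnityEngine.InputSystem.EnhancedTouch.Touch;

// Minimal sketch: logs multi-touch via the Input System's enhanced
// touch API; the touches can come from a remote AR device.
public class TouchLogger : MonoBehaviour
{
    void OnEnable()  => EnhancedTouchSupport.Enable();
    void OnDisable() => EnhancedTouchSupport.Disable();

    void Update()
    {
        foreach (Touch touch in Touch.activeTouches)
            Debug.Log($"Touch {touch.touchId}: {touch.phase} at {touch.screenPosition}");
    }
}
```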

⚡ Supported AR subsystems ⚡

ARCore Cloud Anchors: host and resolve ARCore Cloud Anchors.

Meshing (ARMeshManager): physical environment mesh generation, ARKit mesh classification support.

Occlusion (AROcclusionManager): ARKit depth/stencil human segmentation, ARKit/ARCore environment occlusion (see Limitations).

Face Tracking: face mesh, face pose, eye tracking, ARKit Blendshapes.

Body Tracking: ARKit 2D/3D body tracking, scale estimation.

Plane Tracking: horizontal and vertical plane detection, boundary vertices, raycast support.

Image Tracking: supports mutable image library and replacement of image library at runtime.

Depth Tracking (ARPointCloudManager): feature points, raycast support.

Camera: camera background video (see Limitations), camera position and rotation, facing direction, camera configurations.

CPU images: camera and occlusion CPU images support (see Limitations).

Anchors (ARAnchorManager): add/remove anchors, attach anchors to detected planes.

Session subsystem: Pause/Resume, receive Tracking State, set Tracking Mode.

Light Estimation: Average Light Intensity, Brightness, and Color Temperature; Main Light Direction, Color, and Intensity; Exposure Duration and Offset; Ambient Spherical Harmonics.

Raycast subsystem: perform world-based raycasts against detected planes, point clouds, and the depth map.

Object Tracking: ARKit object detection after scanning objects with the ARKit scanning app (see Limitations).

ARKit World Map: full support of ARWorldMap. Serialize the current world map, deserialize the saved world map and apply it to the current session.
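Because these are the stock AR Foundation managers, ordinary subsystem code runs unmodified in the Editor. For instance, a minimal plane-raycast-and-anchor sketch using the AR Foundation API (the MonoBehaviour itself is illustrative; serialized references are assumed to be wired up in the Inspector):

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.XR.ARFoundation;
using UnityEngine.XR.ARSubsystems;

// Minimal sketch: raycasts against detected planes on a mouse click
// (which touch simulation can drive in the Editor) and attaches an
// anchor at the hit pose.
public class TapToAnchor : MonoBehaviour
{
    [SerializeField] ARRaycastManager raycastManager;
    [SerializeField] ARAnchorManager anchorManager;
    [SerializeField] ARPlaneManager planeManager;

    static readonly List<ARRaycastHit> hits = new List<ARRaycastHit>();

    void Update()
    {
        if (!Input.GetMouseButtonDown(0))
            return;

        if (raycastManager.Raycast(Input.mousePosition, hits, TrackableType.PlaneWithinPolygon))
        {
            ARRaycastHit hit = hits[0];
            ARPlane plane = planeManager.GetPlane(hit.trackableId);
            if (plane != null)
                anchorManager.AttachAnchor(plane, hit.pose); // anchor pinned to the plane
        }
    }
}
```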

💬 Forum thread

📩 Contact developer
