Creating an AR Multiplayer App in Unity
Using EdgeMultiplay and Unity AR Foundation, you can easily create an augmented reality multiplayer game that runs on both Android and iOS.
Unity AR Foundation wraps the native frameworks (ARKit on iOS and ARCore on Android), making it easier to build cross-platform augmented reality apps with minimal configuration.
EdgeMultiplay adds another layer of abstraction on top of that, making it simpler to build augmented reality multiplayer experiences.
The core problem with augmented reality multiplayer games is reconciling global and local position. Because the AR camera's position is driven by the physical movement of the device (via an AR Pose Driver or Tracked Pose Driver), agreeing on a shared world origin and keeping players and other GameObjects in sync across devices becomes challenging.
In EdgeMultiplay, you can define your own world origin and spawn players relative to that world origin transform. When syncing player position and rotation, you sync their local position and/or rotation relative to your own world origin.
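The idea above can be sketched in a few lines of Unity C#. This is a minimal illustration, not EdgeMultiplay code: the `worldOrigin` field stands in for the shared origin (in EdgeMultiplay, `EdgeManager.WorldOriginTransform`, described later), and the helper method names are hypothetical.

```csharp
using UnityEngine;

// Hypothetical helper illustrating the idea: positions are stored and
// synced relative to a shared world origin rather than in world space.
public class WorldOriginSync : MonoBehaviour
{
    // The transform both devices agree on as the shared origin.
    public Transform worldOrigin;

    // Convert this object's world position to origin-relative coordinates
    // before sending it over the network.
    public Vector3 GetRelativePosition()
    {
        return worldOrigin.InverseTransformPoint(transform.position);
    }

    // Apply an origin-relative position received from the other player.
    public void ApplyRelativePosition(Vector3 relativePos)
    {
        transform.position = worldOrigin.TransformPoint(relativePos);
    }
}
```

Because each device converts through its own copy of the agreed origin, the two devices can disagree about world coordinates and still place objects at the same physical spot.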
EdgeMultiplay includes an example AR multiplayer app (AR PingPong).
Download the EdgeMultiplay package from here.
If you encounter any problems with the steps below, feel free to reach out to us on https://discord.com/invite/CHCWfgrxh6.
Once you have EdgeMultiplay in your Project you need to do the following:
1. Add AR Foundation to your Unity Project
2. Configure XR Plugin Management
Add AR Foundation to your Unity Project
Different AR Foundation versions are compatible with different Unity versions.
For reference, here is the compatibility table from the AR Foundation GitHub repository.
This sample app was created using AR Foundation 4.0 and Unity 2020.2.
You can add AR Foundation using the Unity Package Manager.
Configure XR Plugin Management Settings
You can find the XR Plugin Management settings under Project Settings. XR Plugin Management adds the native XR (mixed reality) plug-ins to your Unity project; you will need ARKit for iOS or ARCore for Android.
For iOS, enable ARKit.
For Android, enable ARCore.
You'll also need to provide a camera usage permission description and meet the minimum API level required by ARKit or ARCore.
Once you have completed the steps above, you can safely import the AR PingPong example, a working AR multiplayer game.
Now, open the scene in the AR PingPong example, add it to the Build Settings, and build the project to your phone.
One of the hurdles with AR multiplayer games is testing during development; a convenient setup is to run one instance in the Unity Editor and another on your phone.
Unity Editor Input:
You can press the “A” key on your keyboard (equivalent to placing an AR anchor on the phone).
You can press the left and right arrow keys on your keyboard to move the paddle (equivalent to touch movement on your phone).
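The editor input described above can be sketched as follows, using Unity's legacy Input API. Method names like `PlaceAnchor` and `MovePaddle` are placeholders for the sample's actual logic, not EdgeMultiplay API.

```csharp
using UnityEngine;

// Hypothetical sketch of the editor-only input handling described above.
public class EditorInput : MonoBehaviour
{
#if UNITY_EDITOR
    void Update()
    {
        if (Input.GetKeyDown(KeyCode.A))
            PlaceAnchor();          // same effect as tapping to place an AR anchor

        float move = 0f;
        if (Input.GetKey(KeyCode.LeftArrow))  move = -1f;
        if (Input.GetKey(KeyCode.RightArrow)) move =  1f;
        MovePaddle(move * Time.deltaTime);    // same effect as touch movement
    }
#endif

    void PlaceAnchor()            { /* anchor placement logic goes here */ }
    void MovePaddle(float delta)  { /* paddle movement logic goes here */ }
}
```

Wrapping the keyboard handling in `#if UNITY_EDITOR` keeps it out of the phone build, where touch input is used instead.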
Understanding this AR Multiplayer Sample:
The game logic is the following:
- Place your AR Environment in the desired position
- Once the AR Environment is placed start the connection to the server and wait for a player to join.
- Once a player joins your room, a GameStart event is received and the game begins.
How it works:
Once you place the AR Environment using TapToPlace.cs, AREnvironmentPlaced() is called on GameManager.cs, which does the following:
- Starts the connection to the EdgeMultiplay server.
- Assigns EdgeManager.WorldOriginTransform to the placed AR anchor.
- Waits for the OnGameStart callback (fired once another player joins your room); EdgeManager then creates the player at the local position and rotation specified in EdgeManager.SpawnInfo.
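The flow above can be sketched like this. `AREnvironmentPlaced`, `OnGameStart`, `EdgeManager.WorldOriginTransform`, and `EdgeManager.SpawnInfo` are the names used in this guide; everything else (the `ConnectToServer` placeholder, the wiring) is an assumption, not confirmed SDK API.

```csharp
using UnityEngine;

// Sketch of the GameManager flow described above. ConnectToServer() is a
// placeholder for whatever connection call the SDK actually exposes.
public class GameManagerSketch : MonoBehaviour
{
    // Called by TapToPlace.cs once the AR environment is placed.
    public void AREnvironmentPlaced(Transform placedAnchor)
    {
        // Make the placed anchor the shared world origin, so both devices
        // spawn and sync objects relative to the same physical point:
        // EdgeManager.WorldOriginTransform = placedAnchor;

        // Start the connection to the EdgeMultiplay server.
        ConnectToServer();
    }

    // Fired once another player joins the room. EdgeManager then spawns
    // the players at the local position/rotation set in EdgeManager.SpawnInfo,
    // relative to WorldOriginTransform.
    void OnGameStart()
    {
        Debug.Log("Game started");
    }

    void ConnectToServer() { /* SDK connection call goes here */ }
}
```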
Notice that the two paddles (players) and the ball each have an EdgeMultiplayObserverLocal attached, which is responsible for movement synchronization.
If you have trouble making this work or would like more samples, feel free to reach out to us on our Discord server: https://discord.com/invite/CHCWfgrxh6.