In Mirror 93.0.1 and Unity 6000.0.24f1, MirrorClient and MirrorServer are not found #40

Open
Vince577 opened this issue Feb 8, 2025 · 7 comments

Comments

Vince577 commented Feb 8, 2025

When I imported UniVoice and the samples, the script classes on the "GROUP VOICE CALL SAMPLE" GameObject in the group voice call sample cannot be found because of the #if ... #endif guards, and two errors pop up saying MirrorClient and MirrorServer cannot be found. I have everything imported correctly and UNIVOICE_MIRROR in the Scripting Define Symbols.

(Three screenshots attached showing the errors.)

adrenak commented Feb 8, 2025

Hi @Vince577, the required symbol is UNIVOICE_MIRROR_NETWORK, not UNIVOICE_MIRROR. Let me know if that fixes it! I haven't tried Unity 6, but I don't expect it to make any difference.
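
For context: the sample scripts are only compiled when the right define symbol is present, so if it's missing or misspelled the whole class is stripped out and Unity reports the script class as missing. A rough sketch of the kind of guard involved (the class name is taken from the samples; the body is omitted):

#if UNIVOICE_MIRROR_NETWORK
using UnityEngine;

public class GroupVoiceCallMirrorSample : MonoBehaviour {
    // Sample code lives here; it is only compiled when the symbol above is defined.
}
#endif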

Vince577 commented Feb 8, 2025

@adrenak, that fixed the errors, but another question: how do I set it up on the player?

adrenak commented Feb 8, 2025

You mean you want the sound to come from the players' avatars? You'd need two things:

  • Be able to get the avatar GameObjects of the other clients in the game (there are multiple ways to do this that can be found online)
  • Parent the AudioSource to the avatar so the sound plays from the avatar's position.

I'll write a guide on this and add a sample soon, but here's the general approach.

On each client you need to be able to access the avatars of the other clients. This is required so that when a client joins, you can parent the AudioSource that plays that client's audio to their avatar.

Once you have a system in place that lets you access the avatar GameObjects of other clients in the game, you can use the client.OnPeerJoined event, which tells you when another peer joins the audio chat. Here's the part of the sample scene that uses it to create a UI element, for example.

You can access the output of each client using client.PeerOutputs. So the code initially becomes this:

client.OnPeerJoined += id => {
    var output = client.PeerOutputs[id];  // This gets you the output of the peer
};

But the output object above is of type IAudioOutput; you need to cast it to StreamedAudioSourceOutput. Then you can access the AudioSource component that plays that peer's audio. Once that's done, parent it to the avatar of the client that joined, then set spatialBlend to 1 and minDistance/maxDistance to define how far their sound can travel. The code then becomes something like this:

client.OnPeerJoined += id => {
    // Cast the output so we can access the AudioSource playing this newly joined peer's audio
    var output = client.PeerOutputs[id] as StreamedAudioSourceOutput;
    var audioSource = output.Stream.UnityAudioSource;

    // You'll need to create this method yourself; it should return the avatar GameObject
    // of the client with this ID (one possible way to do this is sketched below).
    var peerAvatar = SomeMethodThatGivesYouTheAvatarUsingConnId(id);
    audioSource.transform.SetParent(peerAvatar.transform);  // parent the AudioSource to the avatar
    audioSource.transform.localPosition = Vector3.zero;     // place it at the avatar root

    audioSource.spatialBlend = 1;  // a spatial blend of 1 makes the audio positional
    audioSource.maxDistance = 25;  // let this peer's audio travel up to 25 meters
};
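
One possible way to implement that avatar-lookup method is a simple registry that each avatar fills in when it spawns. This is only a sketch: the registry below is not part of UniVoice or Mirror, and how you map an avatar to its UniVoice peer ID (and what type that ID has) depends on your own networking setup.

using System.Collections.Generic;
using UnityEngine;

// Hypothetical helper: each avatar registers itself when it spawns, keyed by
// the UniVoice peer ID of its owner. Adjust the key type to match your peer IDs.
public static class PeerAvatarRegistry {
    static readonly Dictionary<short, GameObject> avatars = new Dictionary<short, GameObject>();

    public static void Register(short peerId, GameObject avatar) => avatars[peerId] = avatar;
    public static void Unregister(short peerId) => avatars.Remove(peerId);

    public static GameObject Get(short peerId) =>
        avatars.TryGetValue(peerId, out var avatar) ? avatar : null;
}

With something like that in place, SomeMethodThatGivesYouTheAvatarUsingConnId(id) could simply return PeerAvatarRegistry.Get(id).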

Let me know if this works!

Vince577 commented Feb 10, 2025

@adrenak On this line it says 'client' does not contain a definition for 'PeerOutputs':

var output = client.PeerOutputs[id] as StreamedAudioSourceOutput;

Also, do I have to add anything to the player GameObject? Does this line go in Start or Update? Would this script go on the player? And would this allow players to hear other players?

adrenak commented Feb 10, 2025

My bad! It's session.PeerOutputs as defined here

All of this code belongs in something like a UniVoiceSetup or UniVoiceManager script, inside a Start method. It isn't specific to a player prefab; it's more like configuration for everything UniVoice does throughout the game, and it only needs to run once.

For example, the InitializeSession method in the samples runs just once at the start of the game, in the main GroupVoiceCallMirrorSample class. The event handlers defined in that method take care of everything as events happen.
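
In other words, the structure is roughly this (a sketch only: the actual session creation depends on your UniVoice version, so it is indicated by comments here; the class and method names just mirror the samples):

using UnityEngine;

public class UniVoiceManager : MonoBehaviour {
    void Start() {
        // Runs once for the whole game; this script is NOT attached to the player prefab.
        InitializeSession();
    }

    void InitializeSession() {
        // 1. Create/obtain the UniVoice session (see InitializeSession in the samples).
        // 2. Subscribe session.OnPeerJoined once, using the handler shown earlier
        //    (cast session.PeerOutputs[id] to StreamedAudioSourceOutput, parent the
        //    AudioSource to that peer's avatar, set spatialBlend and maxDistance).
        // The subscribed handler then runs automatically for every peer that joins later.
    }
}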

@Vince577

@adrenak Players can't hear other players' voices at all. Do I have to add a special script to the player prefab along with the AudioSource, or could it be using a different microphone, or both? I'm using ParrelSync and the KCP transport, if that helps. All players use the same prefab, and the same thing happens in the group chat sample. Also, instead of parenting an AudioSource to the player, could I just use the AudioSource the player already comes with?

adrenak commented Feb 12, 2025

@Vince577 since there's no tutorial/doc for this yet, I might need some more information about your current setup to help you out. You can add me on Discord: adrenak#1934
