
Added NVRVirtualHand and NVRVirtualInputDevice to use with NPC and remote users #114

Open: wants to merge 40 commits into master

Conversation

jbroadway
Contributor

Attach NVRVirtualHand to the hands of NPCs and remote users to replay remote interactions over multiplayer. You can control the hand by calling one of three new methods on NVRVirtualHand:

  • Hold() begins holding the closest interactable
  • Release() releases the current interactable
  • Use() triggers UseButtonDown() on the current interactable

…mote users

You can control the hand via 3 new methods on NVRVirtualHand:

- `Hold()` begins holding the closest interactable
- `Release()` releases the current interactable
- `Use()` triggers `UseButtonDown()` on the current interactable
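For illustration, a minimal sketch of how these methods might be driven from an NPC behaviour. The `NPCHandController` class and its call sites are hypothetical; only NVRVirtualHand and its `Hold()`/`Release()`/`Use()` methods come from this PR, and the `NewtonVR` namespace is assumed:

```csharp
using UnityEngine;
using NewtonVR;

// Hypothetical example: a simple NPC behaviour that drives an NVRVirtualHand.
public class NPCHandController : MonoBehaviour
{
    public NVRVirtualHand VirtualHand; // assign in the inspector

    // Called by the NPC's AI when it wants to pick up whatever is nearby.
    public void GrabNearby()
    {
        VirtualHand.Hold();    // begins holding the closest interactable
    }

    public void UseHeldItem()
    {
        VirtualHand.Use();     // triggers UseButtonDown() on the held interactable
    }

    public void Drop()
    {
        VirtualHand.Release(); // releases the current interactable
    }
}
```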
@jbroadway
Contributor Author

I added two new events to NVRHand, OnBeginUseInteraction and OnEndUseInteraction, which can be used to capture the trigger down/up and broadcast them. Using these, we now have NewtonVR working over multiplayer, with grab, release, and the trigger all working, which is very exciting!
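As a rough sketch of the sending side, assuming the new events follow the same UnityEvent&lt;NVRInteractable&gt; pattern as the existing On*Interaction events (that signature, plus the `Broadcast*` placeholders and class name, are assumptions, not part of this PR):

```csharp
using UnityEngine;
using NewtonVR;

// Sketch of the sender side: listen for the new use events on the local NVRHand
// and forward them to your own networking layer. BroadcastUseDown/BroadcastUseUp
// are placeholders for whatever transport you use (Photon, UNET, etc.).
public class LocalHandBroadcaster : MonoBehaviour
{
    public NVRHand Hand; // the local player's hand

    private void OnEnable()
    {
        // Assumes the new events pass the NVRInteractable involved, like the
        // existing OnBeginInteraction/OnEndInteraction events do.
        Hand.OnBeginUseInteraction.AddListener(HandleBeginUse);
        Hand.OnEndUseInteraction.AddListener(HandleEndUse);
    }

    private void OnDisable()
    {
        Hand.OnBeginUseInteraction.RemoveListener(HandleBeginUse);
        Hand.OnEndUseInteraction.RemoveListener(HandleEndUse);
    }

    private void HandleBeginUse(NVRInteractable interactable)
    {
        BroadcastUseDown(interactable.name); // placeholder network call
    }

    private void HandleEndUse(NVRInteractable interactable)
    {
        BroadcastUseUp(interactable.name);   // placeholder network call
    }

    private void BroadcastUseDown(string objectId) { /* send over your transport */ }
    private void BroadcastUseUp(string objectId) { /* send over your transport */ }
}
```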

@SmartCarrion

I haven't inspected your code yet, but judging by the description I'm worried about how this design handles lag. Shouldn't Hold() take the interactable to be acted upon as an argument? If it just grabs the closest thing, and lag or physics makes positioning slightly different on the clients, they could end up out of sync, with each client showing the player grabbing a different object.

@jbroadway
Contributor Author

@SmartCarrion in short, you're right. The Hold() method mainly just wraps PickupClosest(), which is already what gets called when you grab something locally, so it's basically the same on both sides in that sense. But the On*Interaction events on NVRHand do provide an NVRInteractable, so you can send the object that was grabbed and do correction based on that.

Lag is definitely a design challenge in multiplayer VR, but getting NewtonVR to the point where a virtual hand can grab and use objects is step 1. From there, additional logic will be needed for prediction and synchronization. It may work to build that into NewtonVR as a next step, but it may also be something that ends up needing fine-tuning on a case-by-case basis, depending on the type of interaction and gameplay.

This should help ensure virtual hands pick up the same object as the remote player. Note that I also kept the original `Hold()` method for use as an NPC hand controller too.
…ovements being out of sync

This isn't as ideal as client-side prediction and synchronization, but will help ensure objects are still successfully picked up, and those techniques can be built on top of this work later on.
@jbroadway
Contributor Author

We added a few improvements that help replay accuracy, namely a Hold(withObjectID) method to ensure the virtual hand picks up the correct object rather than just the closest one. This doesn't replace the need for client-side prediction, but that can be built on top of this as a next step.
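A rough sketch of how a grab packet might be handled on the receiving side. The packet handling and ID scheme are our own bookkeeping, and the Hold(objectId) overload is assumed to accept the same identifier that was broadcast; check the exact signature added in this PR:

```csharp
using UnityEngine;
using NewtonVR;

// Sketch of the receive path for a "user X grabbed object Y" packet.
public class RemoteGrabHandler : MonoBehaviour
{
    public NVRVirtualHand VirtualHand;

    // Called by the networking layer when a grab packet arrives for this hand.
    public void OnRemoteGrab(string objectId)
    {
        if (!string.IsNullOrEmpty(objectId))
        {
            // Targeted pickup: both clients grab the same object even if physics
            // or lag has nudged positions slightly out of sync.
            VirtualHand.Hold(objectId); // assumed overload added in this PR
        }
        else
        {
            // Fall back to the original proximity-based behaviour (also what an
            // NPC hand controller would typically want).
            VirtualHand.Hold();
        }
    }

    public void OnRemoteRelease()
    {
        VirtualHand.Release();
    }
}
```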

@jbroadway
Contributor Author

I should add that setting an interaction point on the objects makes a big difference to how well this feels :)

@jbroadway
Contributor Author

Any news on when this might get incorporated, or on changes needed to do so?

We're planning additional multiplayer-related work on our end, so it would be great to get the basics in sync with the main NewtonVR project before getting too far along in our forked repo.

@Proktologe

Hi, I just downloaded the latest version from the Asset Store. Where can I find the scripts to use it in multiplayer?

@jbroadway
Contributor Author

@Proktologe to start implementing multiplayer, you'll need to incorporate the modifications from this pull request into your project, and then you can attach virtual hands (the NVRVirtualHand component) to the objects that represent your remote players' hands in the scene.

From there, you would send the transform data of each user's hands and use it to update the position of each virtual hand, as well as capture the various OnBeginUseInteraction events on the regular NVRHand components to know when to call Hold(), Release(), and Use() on the virtual hands.

This of course won't recreate the interactions perfectly, due to network latency, but there are techniques for keeping things in sync. Here's a great resource on those techniques:

http://gafferongames.com/networking-for-game-programmers/what-every-programmer-needs-to-know-about-game-networking/
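To make the steps above concrete, here is a minimal sketch of a receiver for one remote hand. The packet callbacks, smoothing factor, and class name are hypothetical; Hold()/Release()/Use() are the NVRVirtualHand methods from this pull request:

```csharp
using UnityEngine;
using NewtonVR;

// Sketch of the receiving side for one remote hand. Attach this to the same
// GameObject as the NVRVirtualHand.
[RequireComponent(typeof(NVRVirtualHand))]
public class RemoteHandSync : MonoBehaviour
{
    private NVRVirtualHand virtualHand;
    private Vector3 targetPosition;
    private Quaternion targetRotation;

    private void Awake()
    {
        virtualHand = GetComponent<NVRVirtualHand>();
        targetPosition = transform.position;
        targetRotation = transform.rotation;
    }

    // Call from your networking layer whenever a pose update arrives.
    public void OnPosePacket(Vector3 position, Quaternion rotation)
    {
        targetPosition = position;
        targetRotation = rotation;
    }

    // Call when an interaction packet arrives, mirroring the sender's hand events.
    public void OnActionPacket(string action)
    {
        switch (action)
        {
            case "hold":    virtualHand.Hold();    break;
            case "release": virtualHand.Release(); break;
            case "use":     virtualHand.Use();     break;
        }
    }

    private void Update()
    {
        // Smooth toward the last received pose, since packets arrive more slowly
        // than the local frame rate.
        transform.position = Vector3.Lerp(transform.position, targetPosition, 10f * Time.deltaTime);
        transform.rotation = Quaternion.Slerp(transform.rotation, targetRotation, 10f * Time.deltaTime);
    }
}
```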

@jbroadway
Contributor Author

Added one more method, EndUse(), to virtualize releasing the use trigger.

@zite, any ETA on looking into merging this?

jbroadway and others added 6 commits February 27, 2017 16:17
…oo soon

Our user testing has found the following periodic crash on launch:

```
[NewtonVR] Error: NVRInteractables.Register called before initialization.
 
(Filename: C:/buildslave/unity/build/artifacts/generated/common/runtime/UnityEngineDebugBindings.gen.cpp Line: 42)

NullReferenceException: Object reference not set to an instance of an object
  at NewtonVR.NVRInteractables.Register (NewtonVR.NVRInteractable interactable, UnityEngine.Collider[] colliders) [0x00000] in <filename unknown>:0 
  at NewtonVR.NVRInteractable.UpdateColliders () [0x00000] in <filename unknown>:0 
  at NewtonVR.NVRInteractableItem.UpdateColliders () [0x00000] in <filename unknown>:0 
  at NewtonVR.NVRInteractable.Start () [0x00000] in <filename unknown>:0 
  at NewtonVR.NVRInteractableItem.Start () [0x00000] in <filename unknown>:0 
 
(Filename:  Line: -1)

NullReferenceException: Object reference not set to an instance of an object
  at Flipside.Loader.SetAppMode (AppMode newMode) [0x00000] in <filename unknown>:0 
  at Flipside.Loader+<Initialize>c__Iterator36.MoveNext () [0x00000] in <filename unknown>:0 
  at UnityEngine.SetupCoroutine.InvokeMoveNext (IEnumerator enumerator, IntPtr returnValueAddress) [0x00000] in <filename unknown>:0 
 
(Filename:  Line: -1)
```
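For reference, a minimal sketch of the defensive idea behind guarding registration ordering, assuming the registry keeps a static dictionary that may not exist yet when an interactable's Start() runs. The names below are illustrative only, not NewtonVR's actual NVRInteractables internals or the exact change in these commits:

```csharp
using System.Collections.Generic;
using UnityEngine;
using NewtonVR;

// Illustrative only: a registration helper that creates its own backing store
// lazily instead of assuming scene initialization has already run.
public static class SafeInteractableRegistry
{
    private static Dictionary<NVRInteractable, Collider[]> registered;

    public static void Register(NVRInteractable interactable, Collider[] colliders)
    {
        if (registered == null)
        {
            // An interactable's Start() can run before initialization finishes;
            // create storage on demand rather than throwing an NRE.
            registered = new Dictionary<NVRInteractable, Collider[]>();
        }

        registered[interactable] = colliders;
    }
}
```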
@jbroadway
Contributor Author

I had to merge two other PRs into my fork's master branch for manageability, so they're included in this one now too (a fix for an intermittent crash that happens in builds but not in the editor, and separate interaction points for right- and left-handed use).

Any update on looking at these PRs, @zite? It's getting harder to maintain compatibility over several months with no word, and I'd like to keep contributing since we're actively using NewtonVR with multiplayer over here :)

@zite
Member

zite commented Apr 9, 2017

So you're using this for networking? Are you using a plugin or something custom? I was thinking just making a kind of dummy item would work better than trying to stub out a whole NVRPlayer replacement.

At a minimum I'll need to be able to test this to make sure it works before accepting a push. But just to let you know, I may end up going a different route. Can you provide any other code to help me test? If not, that's cool, I can try to put something together myself. Multiplayer was on the roadmap anyway.

@jbroadway
Contributor Author

I'm using this for two things:

  • Networked users
  • Replay of recorded user actions

In both cases I don't use NVRPlayer at all (only for the local player). I just attach the NVRVirtualHand to the hand objects that represent remotes and replays, and on receiving incoming packets I do one of two things:

  1. For positional data, I apply it to the hand's transform with a bit of lerping (I'm only sending positional data at 30 fps), which doesn't touch NVRVirtualHand at all
  2. For actions like hold, release, etc. I just call the appropriate public method on NVRVirtualHand

A quick test would be:

  • Create a simple mirrored "player" in the scene (you could call it Dummy and put a Head, LeftHand and RightHand under it with spheres on them)
  • Add an NVRVirtualHand to each of the dummy's hands
  • Mirror the NVRPlayer's positional data from the head and hands onto the dummy object on update
  • Connect the events on each NVRHand to the mirrored NVRVirtualHand's Hold(), Release(), Use() and EndUse() methods

Beyond that, all networking logic stays outside of NewtonVR and could use Unity's networking, Photon, Steamworks, or anything else without requiring changes to NewtonVR itself.

I can see about setting up an example scene like I described for you too if that would be helpful :)
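A minimal sketch of the dummy mirroring described above, using polling on NVRHand's public button state to stay self-contained; wiring the events as described in the list is equivalent. The component references, offset, and class name here are placeholders:

```csharp
using UnityEngine;
using NewtonVR;

// Mirrors one local hand onto a dummy hand in the same scene, per the quick test
// described above. Assign SourceHand and MirrorHand in the inspector.
public class DummyHandMirror : MonoBehaviour
{
    public NVRHand SourceHand;          // the local player's hand
    public NVRVirtualHand MirrorHand;   // the NVRVirtualHand on the dummy
    public Vector3 MirrorOffset = new Vector3(2f, 0f, 0f);

    private void Update()
    {
        // Mirror the pose of the real hand onto the dummy hand.
        MirrorHand.transform.position = SourceHand.transform.position + MirrorOffset;
        MirrorHand.transform.rotation = SourceHand.transform.rotation;

        // Mirror grab/use state transitions onto the virtual hand's methods.
        if (SourceHand.HoldButtonDown) MirrorHand.Hold();
        if (SourceHand.HoldButtonUp)   MirrorHand.Release();
        if (SourceHand.UseButtonDown)  MirrorHand.Use();
        if (SourceHand.UseButtonUp)    MirrorHand.EndUse();
    }
}
```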

@zite
Member

zite commented Apr 9, 2017

Ah gotcha! Thanks for the extra info. I was eventually planning on adding a separate scene to Newton that shows how to network the example scene, so I figured I'd start here. Most of that wouldn't be built into "main" Newton, just an extra example. Probably going to start with Photon first unless you have another preference.

@zite
Member

zite commented Apr 11, 2017

I've been working on our Photon implementation and I'm not seeing a direct use for this. It seems like you should be tracking the location of the items separately anyway, otherwise they don't move when you release them. I'm not rendering remote hands yet, so maybe my opinion will change when I get there. But it seems like handling this locally is going to produce some weird situations.

For replaying user actions this seems like it could make sense, but with rigidbodies you're going to hit similar weirdness, since things aren't deterministic, there's lag, and other objects can interact with the rigidbodies. Might as well just track the objects in the recording too.

I appreciate you contributing, and this has pushed me to implement a multiplayer example, but right now it's not looking like I'm going to merge this into main; it's probably best left in your fork. However, I am adding some hooks and events that wouldn't have been there without this request, so thanks for that.

@jbroadway
Contributor Author

However it works out, I'm glad if even some of it ends up being useful :)

I'm curious how you see the remote players' actions playing out to each other. When we went to implement hands on our remote avatars, we needed a way of programmatically taking an incoming packet that said "user x grabbed object y" and applying that to the hand/interactable on the receiving end. That's where those extra methods on NVRVirtualHand came in for us, since remote users wouldn't have the NVRPlayer or NVRHand that the local player does.

We've also implemented a couple of different methods for replaying object movement, depending on the type of object. For some, we don't care if they replay exactly and just apply the velocity, etc. on release; others we track more precisely.

Anyway, looking forward to seeing your multiplayer example too. Cheers!

Elements added for custom avatars were being duplicated into the
physical hand if the Vive controllers weren't turned on before the
avatar was assigned.
@zite
Member

zite commented Apr 20, 2017

Wellll, I'm kind of doing a 180 here. Initially I had object tracking without modifying any of the player bits. But now that I'm trying to render the players too, I realize I should probably just do a full set of NVRPlayer inheritance down to the hand. That way each client can just check .IsAttached on objects, get the hand from the player, things like that. I'm skipping over NVRInputDevice right now because it seems kind of redundant, but we'll see how things go as I progress. Just thought I'd give you an update to let you know I'm still working on it.

@jbroadway
Contributor Author

Since our fork has evolved a bit beyond this PR (adding separate left- and right-hand interaction points) and is being actively maintained for our project, I've added a wiki page here that describes what we've added. Hope that helps in case these changes are useful to others.

John Luxford and others added 16 commits September 11, 2017 15:33
Compares the last snapped position and rotation with the current one and applies a small weight before rounding, so you need to move slightly past 50% before it will snap to the next position or rotation. This way, it doesn't snap back and forth at the 50% points between snapped positions.
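Illustrative sketch of the hysteresis idea in this commit message; the function names and weight value are assumptions, not NewtonVR's actual snapping code:

```csharp
using UnityEngine;

// Bias rounding toward the previously snapped value so the item has to move
// slightly past the midpoint before it jumps to the neighbouring snap step.
public static class SnapWithHysteresis
{
    // Snap 'value' to multiples of 'step', biased toward 'lastSnapped'.
    public static float Snap(float value, float step, float lastSnapped, float weight = 0.1f)
    {
        // Nudge the value a fraction of a step back toward the last snapped
        // position, so the 50% point must be exceeded by 'weight' of a step
        // before rounding flips to the next step.
        float biased = value + Mathf.Sign(lastSnapped - value) * weight * step;
        return Mathf.Round(biased / step) * step;
    }
}
```

Applied per axis (and per rotation angle), this keeps an item from flickering between two snap points when it sits near the midpoint.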
Added line `ButtonMapping.Add(NVRButtons.ApplicationMenu, OVRInput.Button.Start);` to get the Application Menu button working on the left controller. The sibling button on the right controller is reserved and triggers the Oculus menu.
NVRInteractableItemSnappable allows items to snap to things