The KeywordManager is now obsolete; you now have to use SpeechInputSource and SpeechInputHandler.

First, we add a Messenger, as already described in this blog post, to the Managers game object. It sits in Assets/HoloToolkitExtensions/Scripts/Messaging. Then, since we are good boy scouts that like to keep things organized, we create a folder "Scripts" under "Assets/App". In "Scripts" we add a "Messages" folder, and in that we create the following highly complicated message class:

public class ShowHelpMessage
{
}

In Scripts we create the SpeechCommandExecutor, which is simply this:

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class SpeechCommandExecutor : MonoBehaviour
{
    public void OpenHelpScreen()
    {
        Messenger.Instance.Broadcast(new ShowHelpMessage());
    }
}

Add this SpeechCommandExecutor to the Managers game object. Also add a SpeechInputSource script from the HoloToolkit, click the tiny plus-button on the right and add "show help" as a keyword. Also, select a key in "Key Shortcut". Although the Unity3D editor supports voice commands, you can now also use a key to test the flow. And believe me – your colleagues will thank you for that. Although lots of my colleagues are now quite used to me talking to devices and gesturing in empty air, repeatedly shouting at a computer because it was not possible to determine whether there's a bug in the code or the computer just did not hear you… is still kind of frowned upon.

Anyway. To connect the SpeechCommandExecutor to the SpeechInputSource we need a SpeechInputHandler, so drag one onto the Managers object. Once again you have to click a very tiny plus-button, then:

- Check the "Is Global Listener" checkbox (that is there because of a pull request by Yours Truly).
- Select "Show help" from the keyword drop down.
- Select the plus-button under "Responses".
- Drag the Managers object from the Hierarchy to the box under "Runtime only".
- Change "Runtime only" to "Editor and Runtime".
- Select "SpeechCommandExecutor" and then "OpenHelpScreen" from the right dropdown.

In Assets/App/Scripts, double-click SpeechCommandExecutor. This will open Visual Studio on the SpeechCommandExecutor. Set a breakpoint on (new ShowHelpMessage()). Click the play button, and press "0", or shout "Show help" if you think that's funny (on my machine, speech recognition in the editor does not work on most occasions, so I am very happy with the key option). If you have wired up everything correctly, the breakpoint should be hit. Stop Visual Studio and leave Unity Play Mode again.

Making the screen follow your gaze

Another script from my HoloToolkitExtensions, that I already mentioned in some form, is MoveByGaze. You will find it in the Animation folder of the HoloToolkitExtensions in the demo project. This is an updated, LeanTween (instead of iTween) based version of a thing I already described before in this post, so I won't go over it in detail. The relevant fragments:

using HoloToolkitExtensions.SpatialMapping;

namespace HoloToolkitExtensions.Animation

public BaseRayStabilizer Stabilizer = null;
public BaseSpatialMappingCollisionDetector CollisonDetector;

CollisonDetector = new DefaultMappingCollisionDetector();

var newPos = LookingDirectionHelpers.GetPostionInLookingDirection(2.0f, …); // argument list truncated in the excerpt

if ((newPos - _lastMoveToLocation).magnitude > DistanceTrigger || _isJustEnabled)
{
    var maxDelta = CollisonDetector.GetMaxDelta(newPos - transform.position);
    LeanTween.moveLocal(gameObject, transform.position + maxDelta,
        2.0f * maxDelta.magnitude / Speed).setEaseInOutSine().setOnComplete(MovingDone);
}
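To make the message flow concrete, here is a sketch of what a receiver of ShowHelpMessage could look like. This is not code from the demo project: the HelpScreenController class, its HelpScreen field, and the exact AddListener/RemoveListener signatures are my assumptions about the HoloToolkitExtensions Messenger described above.

```csharp
using HoloToolkitExtensions.Messaging;
using UnityEngine;

// Hypothetical receiver: activates a help UI when a ShowHelpMessage arrives.
public class HelpScreenController : MonoBehaviour
{
    public GameObject HelpScreen; // assumed: assigned in the editor to the help UI

    void Start()
    {
        // Assumed Messenger API, mirroring the Broadcast call in SpeechCommandExecutor
        Messenger.Instance.AddListener<ShowHelpMessage>(OnShowHelp);
    }

    void OnDestroy()
    {
        // Unsubscribe so the Messenger does not keep a dangling reference
        Messenger.Instance.RemoveListener<ShowHelpMessage>(OnShowHelp);
    }

    private void OnShowHelp(ShowHelpMessage message)
    {
        HelpScreen.SetActive(true);
    }
}
```

With this pattern the SpeechCommandExecutor never needs a direct reference to the help screen; the Messenger decouples the voice command from whatever reacts to it.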