Building a floating HoloLens ‘info screen’ - 2: adding the C# behaviours


Intro

In the first installment of this 2-part blog post we created the UI of the info screen; now we are going to build the dynamics:

  • First we will make the app recognize a speech command.
  • Then we will add the dynamics to make the screen appear and disappear when we want.
  • Then we will make the close button work.
  • As a finishing touch we will add some spatial sound.

Adding a speech command – the newest new way

For the second or maybe even the third time since I started using the HoloToolkit, the way speech commands are supposed to work has changed. The KeywordManager is now obsolete; you now have to use SpeechInputSource and SpeechInputHandler.

First, we add a Messenger, as already described in this blog post, to the Managers game object. It sits in Assets/HoloToolkitExtensions/Scripts/Messaging.
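
For those who skipped that post: the Messenger is essentially a simple type-based publish/subscribe hub. A rough sketch of its surface, as far as this article uses it (the real implementation in the demo project does more), looks like this:

using System;
using System.Collections.Generic;
using UnityEngine;

// Rough sketch of the Messenger used in this article: listeners register
// per message type, and Broadcast calls every listener for that type
public class Messenger : MonoBehaviour
{
    public static Messenger Instance { get; private set; }

    private readonly Dictionary<Type, List<Delegate>> _listeners =
        new Dictionary<Type, List<Delegate>>();

    void Awake()
    {
        // Simple singleton, since the Messenger sits on the Managers object
        Instance = this;
    }

    public void AddListener<T>(Action<T> listener)
    {
        List<Delegate> list;
        if (!_listeners.TryGetValue(typeof(T), out list))
        {
            list = new List<Delegate>();
            _listeners[typeof(T)] = list;
        }
        list.Add(listener);
    }

    public void Broadcast<T>(T message)
    {
        List<Delegate> list;
        if (_listeners.TryGetValue(typeof(T), out list))
        {
            foreach (var l in list)
            {
                ((Action<T>)l)(message);
            }
        }
    }
}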

Then, since we are good boy scouts who like to keep things organized, we create a folder “Scripts” under “Assets/App”. In “Scripts” we add a “Messages” folder, and in that we create the following highly complicated message class ;)

public class ShowHelpMessage
{
}

In Scripts we create the SpeechCommandExecutor, which is simply this:

using HoloToolkitExtensions.Messaging;
using UnityEngine;

public class SpeechCommandExecutor : MonoBehaviour
{
    public void OpenHelpScreen()
    {
        Messenger.Instance.Broadcast(new ShowHelpMessage());
    }
}

Add this SpeechCommandExecutor to the Managers game object. Also add a SpeechInputSource script from the HoloToolkit, click the tiny plus-button on the right and add “show help” as a keyword:

[Images: SpeechInputSource with the “show help” keyword added]

Also, select a key under “Key Shortcut”. Although the Unity3D editor supports voice commands, you can now also use a key press to test the flow. And believe me – your colleagues will thank you for that. Although lots of my colleagues are by now quite used to me talking to devices and gesturing in empty air, repeatedly shouting at a computer because it was impossible to determine whether there was a bug in the code or the computer just did not hear you… is still kind of frowned upon.

Anyway. To connect the SpeechCommandExecutor to the SpeechInputSource we need a SpeechInputHandler. That is also in the HoloToolkit, so drag it out of there onto the Managers object. Once again you have to click a very tiny plus-button:

[Image: the SpeechInputHandler component]

And then the workflow is as follows:

[Image: SpeechInputHandler response settings]

  1. Check the “Is Global Listener” checkbox (that is there because of a pull request by Yours Truly)
  2. Select the plus-button under “Responses”
  3. Select “Show help” from the keyword dropdown
  4. Drag the Managers object from the Hierarchy to the box under “Runtime Only”
  5. Change “Runtime Only” to “Editor and Runtime”
  6. Select “SpeechCommandExecutor” and then “OpenHelpScreen” from the right dropdown

To test whether you have set up everything correctly:

In Assets/App/Scripts, double-click SpeechCommandExecutor.


This will open Visual Studio on the SpeechCommandExecutor. Set a breakpoint on

Messenger.Instance.Broadcast(new ShowHelpMessage());

Hit F5, and return to Unity3D. Click the play button, and press “0”, or shout “Show help” if you think that’s funny (on my machine, speech recognition in the editor does not work on most occasions, so I am very happy with the key option).

If you have wired up everything correctly, the breakpoint should be hit. Stop Visual Studio and leave Unity Play Mode again. This part is done.

Making the screen follow your gaze

Another script from my HoloToolkitExtensions, which I already mentioned in some form, is MoveByGaze. It looks like this:

using UnityEngine;
using HoloToolkit.Unity.InputModule;
using HoloToolkitExtensions.SpatialMapping;
using HoloToolkitExtensions.Utilities;

namespace HoloToolkitExtensions.Animation
{
    public class MoveByGaze : MonoBehaviour
    {
        public float MaxDistance = 2f;

        public float DistanceTrigger = 0.2f;

        public float Speed = 1.0f;

        public BaseRayStabilizer Stabilizer = null;

        public BaseSpatialMappingCollisionDetector CollisonDetector;

        private float _startTime;
        private float _delay = 0.5f;

        private bool _isJustEnabled;
        private bool _isBusy;

        private Vector3 _lastMoveToLocation;

        // Use this for initialization
        void Start()
        {
            _startTime = Time.time + _delay;
            _isJustEnabled = true;
            if (CollisonDetector == null)
            {
                CollisonDetector = new DefaultMappingCollisionDetector();
            }
        }

        void OnEnable()
        {
            _isJustEnabled = true;
        }

        // Update is called once per frame
        void Update()
        {
            if (_isBusy || _startTime > Time.time)
                return;

            // Find the point MaxDistance along the (stabilized) viewing direction
            var newPos = LookingDirectionHelpers.GetPostionInLookingDirection(MaxDistance,
                Stabilizer);
            // Only move when the gaze has strayed far enough from the last target,
            // or when the behaviour has just been (re)enabled
            if ((newPos - _lastMoveToLocation).magnitude > DistanceTrigger || _isJustEnabled)
            {
                _isJustEnabled = false;
                // Ask the collision detector how much of the desired movement is possible
                var maxDelta = CollisonDetector.GetMaxDelta(newPos - transform.position);
                if (maxDelta != Vector3.zero)
                {
                    _isBusy = true;
                    newPos = transform.position + maxDelta;
                    LeanTween.moveLocal(gameObject, newPos,
                        2.0f * maxDelta.magnitude / Speed).setEaseInOutSine().setOnComplete(MovingDone);
                    _lastMoveToLocation = newPos;
                }
            }
        }

        private void MovingDone()
        {
            _isBusy = false;
        }
    }
}

This is an updated, LeanTween-based (instead of iTween) version of a thing I already described before in this post, so I won’t go over it in detail. You will find it in the Animation folder of the HoloToolkitExtensions in the demo project. It uses the helper classes BaseSpatialMappingCollisionDetector, DefaultMappingCollisionDetector and SpatialMappingCollisionDetector that are also described in the same post – these are in the HoloToolkitExtensions/SpatialMapping folder of the demo project.
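
For context, the helpers have roughly the shape below. This is a simplified sketch, not the actual implementation – in particular, the real SpatialMappingCollisionDetector also checks the desired movement against the spatial map:

using UnityEngine;
using HoloToolkit.Unity.InputModule;

namespace HoloToolkitExtensions.Utilities
{
    // Sketch of the helper used by MoveByGaze: project a point a given
    // distance along the (stabilized) viewing direction
    public static class LookingDirectionHelpers
    {
        public static Vector3 GetPostionInLookingDirection(float maxDistance,
            BaseRayStabilizer stabilizer)
        {
            var ray = stabilizer != null
                ? stabilizer.StableRay
                : new Ray(Camera.main.transform.position,
                          Camera.main.transform.forward);
            return ray.origin + ray.direction * maxDistance;
        }
    }
}

namespace HoloToolkitExtensions.SpatialMapping
{
    // Base contract: given a desired movement delta, return how much of
    // that movement can actually be performed
    public class BaseSpatialMappingCollisionDetector : MonoBehaviour
    {
        public virtual Vector3 GetMaxDelta(Vector3 delta)
        {
            return delta;
        }
    }

    // Fallback used when no detector is assigned in the editor:
    // simply allow the full movement
    public class DefaultMappingCollisionDetector : BaseSpatialMappingCollisionDetector
    {
    }
}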

The short workflow, in case you don’t want to go back to that article:

  • Add a SpatialMappingCollisionDetector to the Plane in the HelpHolder
  • Add a MoveByGaze to the HelpHolder itself
  • Drag the InputManager on top of the “Stabilizer” field in the MoveByGaze script
  • Drag the Plane on top of the “Collision Detector” field

The result should look like this:

[Image: MoveByGaze and SpatialMappingCollisionDetector configured in the editor]

I would suggest updating “Speed” to 2.5, because although the screen moves nicely and fluidly, the default value is a bit slow for my taste. If you now press the Play button in Unity, you will see the screen already following the gaze cursor as you move around with the mouse or the keyboard.

The only thing is, it is not always aligned with the camera. For that, we have the LookAtCamera script I already wrote about in October in part 3 of the HoloLens airplane tracker app, but I will show it here anyway because it’s so small:

using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class LookatCamera : MonoBehaviour
    {
        public float RotateAngle = 180f;

        void Update()
        {
            gameObject.transform.LookAt(Camera.main.transform);
            gameObject.transform.Rotate(Vector3.up, RotateAngle);
        }
    }
}

The only change between this and the earlier version is that you can now set the rotate angle in the editor ;). Drag it on top of the HelpHolder and the screen will now always face the user after moving to a place right in front of it.

Fading in/out the help screen

In the first video you can see the screen fades nicely in on the voice command, and out when it’s clicked. The actual fading is done by no less than three classes, two of which are inside the HoloToolkitExtensions. First is this simple FadeInOutController, which is actually usable all by itself:

using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class FadeInOutController : MonoBehaviour
    {
        public float FadeTime = 0.5f;

        protected bool IsVisible { get; private set; }
        private bool _isBusy;

        public virtual void Start()
        {
            Fade(false, 0);
        }

        private void Fade(bool fadeIn, float time)
        {
            if (!_isBusy)
            {
                _isBusy = true;
                LeanTween.alpha(gameObject, fadeIn ? 1 : 0, time).setOnComplete(() => _isBusy = false);
            }
        }

        public virtual void Show()
        {
            IsVisible = true;
            Fade(true, FadeTime);
        }

        public virtual void Hide()
        {
            IsVisible = false;
            Fade(false, FadeTime);
        }
    }
}

So this is a pretty simple behaviour that fades the current gameobject in or out in a configurable timespan, and it makes sure it will not get interrupted while doing the fade. Also – notice it initially fades the gameobject out in zero time, so initially any gameobject with this behaviour will be invisible.
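
To see it in action in isolation, you could hang the controller on a gameobject with a renderer and poke it from a little test behaviour like this (a hypothetical snippet, not part of the demo project):

using HoloToolkitExtensions.Animation;
using UnityEngine;

// Hypothetical test behaviour: fades the target in and out with key presses
public class FadeTester : MonoBehaviour
{
    public FadeInOutController Target;

    void Update()
    {
        if (Target == null)
        {
            return;
        }
        if (Input.GetKeyDown(KeyCode.F))
        {
            Target.Show();
        }
        if (Input.GetKeyDown(KeyCode.G))
        {
            Target.Hide();
        }
    }
}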

Next up is BaseTextScreenController, which is a child class of FadeInOutController:

using UnityEngine;
using System.Collections;
using System.Collections.Generic;
using System.Linq;

namespace HoloToolkitExtensions.Animation
{
    public class BaseTextScreenController : FadeInOutController
    {
        private List<MonoBehaviour> _allOtherBehaviours;

        // Use this for initialization
        public override void Start()
        {
            base.Start();
            _allOtherBehaviours = GetAllOtherBehaviours();
            SetComponentStatus(false);
        }

        public override void Show()
        {
            if (IsVisible)
            {
                return;
            }
            SetComponentStatus(true);
            var a = GetComponent<AudioSource>();
            if (a != null)
            {
                a.Play();
            }
            base.Show();
        }

        public override void Hide()
        {
            if (!IsVisible)
            {
                return;
            }
            base.Hide();
            StartCoroutine(WaitAndDeactivate());
        }

        IEnumerator WaitAndDeactivate()
        {
            yield return new WaitForSeconds(0.5f);
            SetComponentStatus(false);
        }
    }
}

So this override, on start, gathers all other behaviours, then de-activates components (this will be explained below). When Show is called, it first activates the components, then tries to play a sound, then calls the base Show to unfade the control. If Hide is called, it first calls the base fade, then after a short wait starts to de-activate all components again.

So what is the deal with this? The two missing routines look like this:

private List<MonoBehaviour> GetAllOtherBehaviours()
{
    var result = new List<Component>();
    GetComponents(result);
    var behaviors = result.OfType<MonoBehaviour>().Where(p => p != this).ToList();
    GetComponentsInChildren(result);
    // GetComponentsInChildren includes this gameobject itself as well,
    // so filter out this behaviour and anything already collected
    var childBehaviors = result.OfType<MonoBehaviour>()
        .Where(p => p != this && !behaviors.Contains(p)).ToList();
    behaviors.AddRange(childBehaviors);
    return behaviors;
}

private void SetComponentStatus(bool active)
{
    foreach (var c in _allOtherBehaviours)
    {
        c.enabled = active;
    }
    for (var i = 0; i < transform.childCount; i++)
    {
        transform.GetChild(i).transform.gameObject.SetActive(active);
    }
}

As you can see, the first method simply finds all behaviours in the gameobject – the screen – and its children, except for this behaviour itself. If you supply "false" for "active", SetComponentStatus will first disable all those behaviours, and then set all child gameobjects to inactive.

The point of this is that we have a lot of things happening in this screen: it’s following your gaze, checking for collisions, spinning a button, and waiting for clicks – all in vain as long as the screen is invisible. So this setup makes the whole screen dormant, and can also bring it back ‘to life’ again by supplying ‘true’. The important part is to do things in the right order (first the behaviours, then the gameobjects). It’s also important to gather the behaviours at the start, because once gameobjects are deactivated, you can’t get to their behaviours anymore.
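
That last point is standard Unity behaviour: GetComponentsInChildren skips inactive gameobjects by default. A throwaway behaviour like this hypothetical snippet (not part of the demo project) demonstrates it:

using UnityEngine;

// Hypothetical demo: after deactivating the children, their behaviours
// are no longer returned by GetComponentsInChildren
public class DeactivationDemo : MonoBehaviour
{
    void Start()
    {
        var before = GetComponentsInChildren<MonoBehaviour>().Length;
        foreach (Transform child in transform)
        {
            child.gameObject.SetActive(false);
        }
        var after = GetComponentsInChildren<MonoBehaviour>().Length;
        Debug.LogFormat("Behaviours found before: {0}, after: {1}", before, after);
    }
}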

The final class does nearly nothing – but it is the only app-specific class:

using HoloToolkitExtensions.Animation;
using HoloToolkitExtensions.Messaging;

public class HelpTextController : BaseTextScreenController
{
    public override void Start()
    {
        base.Start();
        Messenger.Instance.AddListener<ShowHelpMessage>(ShowHelp);
    }

    private void ShowHelp(ShowHelpMessage arg1)
    {
        Show();
    }
}

Basically the only thing this does is make sure the Show method is called when a ShowHelpMessage is received. If you drag this HelpTextController on top of the HelpHolder and press the Unity play button, you see an empty screen instead of the help screen. But if you press 0 or yell “show help”, the screen will pop up.

Closing the screen by a button tap

So now the screen is initially invisible and appears on a speech command – how do we get rid of it again? With this very simple script the circle is closed:

using HoloToolkit.Unity.InputModule;
using UnityEngine;

namespace HoloToolkitExtensions.Animation
{
    public class CloseButton : MonoBehaviour, IInputClickHandler
    {
        void Awake()
        {
            gameObject.SetActive(true);
        }

        public void OnInputClicked(InputClickedEventData eventData)
        {
            // Find the screen this button belongs to and fade it out
            var h = gameObject.GetComponentInParent<BaseTextScreenController>();
            if (h != null)
            {
                h.Hide();
                // Give audible feedback if a sound is attached
                var a = gameObject.GetComponent<AudioSource>();
                if (a != null)
                {
                    a.Play();
                }
            }
        }
    }
}

This is a standard HoloToolkit IInputClickHandler – when the user clicks the button, it tries to find a BaseTextScreenController in its parents and calls the Hide method, effectively fading out the screen. And it tries to play a sound, too.

Some finishing audio touches

Two behaviours – the CloseButton and the BaseTextScreenController – try to play a sound when they are activated. As I have stated multiple times before, having immediate audio feedback when a HoloLens app ‘understands’ a user-initiated action is vital, especially when that action’s execution may take some time. At no point do you want the user to have a ‘huh, it’s not doing anything’ feeling.

In the demo project I have included two audio files I use quite a lot – “Click” and “Ready”. “Click” should be added to the Sphere in HelpHolder. That is easily done by dragging it from App/Scripts/Audio onto the Sphere. That will automatically create an AudioSource.

The following settings are important:

  • Check the “Spatialize” checkbox
  • Uncheck the “Play On Awake” checkbox
  • Move the “Spatial Blend” slider all the way to the right
  • In the 3D Sound Settings section, set “Volume Rolloff” to “Custom Rolloff”

Finally, drag “Ready” on top of the HelpHolder itself, where it will be picked up by the HelpTextController (which is a child class of BaseTextScreenController), and apply the same settings. You might consider not using spatial sound here, though, because it’s not a sound that is particularly attached to a location – it’s a general confirmation sound.
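
If you prefer to apply these settings from code rather than in the editor, this is what they map to on a standard Unity AudioSource (a hypothetical helper, not part of the demo project):

using UnityEngine;

// Hypothetical helper mirroring the editor settings described above
public static class SpatialAudioSetup
{
    public static void Configure(AudioSource source)
    {
        source.playOnAwake = false;                    // "Play On Awake" unchecked
        source.spatialize = true;                      // "Spatialize" checked
        source.spatialBlend = 1.0f;                    // slider fully to the right (3D)
        source.rolloffMode = AudioRolloffMode.Custom;  // "Volume Rolloff" = Custom Rolloff
    }
}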

Conclusion

To be honest, a 2D-ish help screen feels a bit like a stopgap. You can also try to have a kind of video or audio message showing/telling the user about the options that are available. Ultimately you can think of an intelligent virtual assistant that teaches you the intricacies of an immersive app. With the advent of ‘intelligent’ bots and stuff like LUIS it might actually become possible to have an app help you through its own functionality by having a simple question-and-answer conversation with it. I had quite an interesting discussion about this subject at Unity Unite Europe last Wednesday. But then again, since Roman times we have pointed people in the right direction or conveyed commercial messages by using traffic signs and street signs – essentially 2D signs in a 3D world as well. Sometimes we used painted murals, or even statue-like things. KISS sometimes just works.

The completed demo project can be downloaded here.