General Information
There are some system requirements for HoloLens development; the specifications of the laptop used in this study are:
• 64-bit operating system
• Windows 10 Pro
• Visual Studio 2017
• 16GB RAM
• DirectX 12
The interface is designed in Unity Game Engine version 2017.4.1f1, the version recommended for HoloLens development at the time of development, in the first half of 2018.
First, the MixedRealityToolkit-Unity package is downloaded from GitHub and imported into the Unity project. This package contains the toolkit necessary for developing for HoloLens and other mixed-reality devices. The required device capabilities are declared in the Unity player settings; the capabilities enabled for this project are InternetClientServer, PrivateNetworkClientServer, and Spatial Perception. Spatial Perception is critical for the Spatial Mapping feature of the HoloLens. The Spatial Mapping asset from the toolkit is also added to the project, along with the Input Manager for gaze and gesture settings. The suggested distance to the nearest visible object is 85 cm, but during development it is changed to 10 cm to give the user the sense of touching the holograms.
Some basic scripts are used and modified for each component in the interface (App.A, Figure 17, Table 7).
SceneContentAdjuster:
In the development process, the interface elements are combined into one object called “Scene Content” so that the SceneContentAdjuster can be added (Figure 18). This script aligns all scene content with the height of the user. Since all objects in the scene are connected through the button logic, they are all layered in the scene under the first object, “menu”.
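A minimal sketch of what such a height-alignment script might look like, assuming Camera.main returns the HoloLens head camera; the field name and placement distance are illustrative, not the toolkit's actual implementation:

```csharp
using UnityEngine;

// Hypothetical sketch: align scene content with the user's head height at startup.
public class SceneContentAdjuster : MonoBehaviour
{
    [SerializeField] private float defaultGazeDistance = 2.0f; // assumed placement distance

    private void Start()
    {
        // Place the content at the user's head height, straight ahead on the horizontal plane.
        Transform head = Camera.main.transform;
        Vector3 forwardFlat = Vector3.ProjectOnPlane(head.forward, Vector3.up).normalized;
        transform.position = head.position + forwardFlat * defaultGazeDistance;
    }
}
```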
Figure 17: Main scripts used for each component in the interface.
Table 7: Key interaction scripts.
MoveWithObject:
The menu object contains the main-menu button and the move and stop holograms. The MoveWithObject script is attached to the menu object so that it follows the user's head movement (Figure 18). In that way, if the user unlocks the menu, it moves along with the user.
Figure 18: SceneContentAdjuster and MoveWithObject scripts.
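A sketch of the follow-the-head behaviour described above, under the assumption that the move and stop holograms toggle an IsActive flag; the field names and smoothing approach are illustrative:

```csharp
using UnityEngine;

// Hypothetical sketch: the menu drifts toward a point in front of the user's gaze.
public class MoveWithObject : MonoBehaviour
{
    public bool IsActive; // assumed to be toggled by the "move" and "stop" holograms
    [SerializeField] private float followDistance = 1.5f;
    [SerializeField] private float followSpeed = 2.0f;

    private void Update()
    {
        if (!IsActive) return;
        Transform head = Camera.main.transform;
        Vector3 target = head.position + head.forward * followDistance;
        // Smoothly interpolate instead of snapping, so the menu trails the head motion.
        transform.position = Vector3.Lerp(transform.position, target, followSpeed * Time.deltaTime);
    }
}
```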
InteractiveToggle:
All buttons on the main-menu have the InteractiveToggle script, so they remain pushed when the user taps them and are released when the user taps them a second time (Figure 19).
Figure 19: InteractiveToggle script view on Unity.
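The toggle logic can be sketched as follows; the event names are assumptions, and the input system is assumed to call OnTapped() on each air tap:

```csharp
using UnityEngine;
using UnityEngine.Events;

// Hypothetical toggle-button sketch: each tap flips the pressed state.
public class InteractiveToggle : MonoBehaviour
{
    public UnityEvent OnSelection;   // fired when the button becomes pressed
    public UnityEvent OnDeselection; // fired when it is released again
    private bool isSelected;

    public void OnTapped()
    {
        isSelected = !isSelected;
        (isSelected ? OnSelection : OnDeselection).Invoke();
    }
}
```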
Under the build button, there are four primitives with the “InstantiateObject” script attached. This script creates new primitives in the scene. Each primitive also has the “ColorTransition”, “CycleColor”, and “CycleClicker” scripts, so users can change their colors by air tap. The “TwoHandManipulatable” script on them allows moving, rotating, and scaling by air tap and hold.
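The spawning and color-cycling steps might look like the sketch below; the prefab field, color palette, and OnTapped() entry point are assumptions for illustration:

```csharp
using UnityEngine;

// Hypothetical sketch: spawn a copy of the primitive in front of the user on tap.
public class InstantiateObject : MonoBehaviour
{
    [SerializeField] private GameObject prefab; // primitive to clone

    public void OnTapped()
    {
        Transform head = Camera.main.transform;
        Instantiate(prefab, head.position + head.forward * 1.5f, Quaternion.identity);
    }
}

// Hypothetical sketch: step through a fixed palette on each tap.
public class CycleColor : MonoBehaviour
{
    [SerializeField] private Color[] colors = { Color.red, Color.green, Color.blue };
    private int index;

    public void OnTapped()
    {
        index = (index + 1) % colors.Length;
        GetComponent<Renderer>().material.color = colors[index];
    }
}
```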
The mood button has a sub-menu with four mood options. These also work as toggle-buttons to allow easy switching on and off. Each mood option has the “InteractiveToggle” script to activate the attached particle type: the rainy mood has rain particles, the snowy mood snow particles, the heart-rain mood heart particles, and the fireworks mood star particles. The user can activate all the particles at the same time and mix them, or deactivate individual moods by pushing their buttons again.
Four animals are placed under the animals button with the “InstantiateObject” script attached, following the same logic as the “build” category. When the user taps on an animal, that animal is instantiated in front of the user. An “Interactive” script and an animator are attached to each animal to trigger animation and sound feedback when the user taps on it; each animal has a unique animation and voice. Again, the “TwoHandManipulatable” script allows moving, rotating, and scaling by air tap and hold.
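The animation-and-sound feedback could be sketched as below, assuming each animal's Animator has a trigger (here hypothetically named “Tapped”) and an AudioSource holding its voice clip:

```csharp
using UnityEngine;

// Hypothetical sketch of the per-animal tap feedback.
[RequireComponent(typeof(Animator), typeof(AudioSource))]
public class AnimalInteractive : MonoBehaviour
{
    private Animator animator;
    private AudioSource voice;

    private void Awake()
    {
        animator = GetComponent<Animator>();
        voice = GetComponent<AudioSource>();
    }

    public void OnTapped()
    {
        animator.SetTrigger("Tapped"); // play the animal's unique animation
        voice.Play();                  // play its voice clip
    }
}
```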
The place button contains only one sample room to show the possibilities of the interface. It works similarly to the other elements: the user can instantiate the room by tapping on it, and move, rotate, and resize it with the air-tap-and-hold gesture.
When the delete button is activated, the “AllDeleteButtons” script runs, and the delete icons hidden in the objects appear on the holograms in the scene. Users can tap the icons of the holograms they want to remove from the scene. After the deletion process is completed, the user can deactivate the delete button by tapping it again, which runs the “AllDeleteButtonsInactive” script and makes the delete icons disappear.
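One way to show and hide the icons in bulk is sketched below; the shared registry and the convention that each spawned hologram registers its delete icon are assumptions for illustration:

```csharp
using System.Collections.Generic;
using UnityEngine;

// Hypothetical sketch: toggle every hologram's delete icon at once.
public class AllDeleteButtons : MonoBehaviour
{
    // Assumed convention: each spawned hologram adds its delete icon to this list.
    public static readonly List<GameObject> DeleteIcons = new List<GameObject>();

    public void SetDeleteIconsActive(bool active)
    {
        foreach (GameObject icon in DeleteIcons)
        {
            if (icon != null) icon.SetActive(active); // skip icons whose hologram was deleted
        }
    }
}
```

A registry is used rather than a scene search, because deactivated GameObjects cannot be found again by tag-based lookups.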
The mute button has two scripts attached: muteButton and unmuteButton. When the button is activated, the “muteButton” script runs and all sounds in the interface are muted; when it is deactivated, the “unmuteButton” script runs and the sounds play as usual.
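A minimal sketch of a global mute, using Unity's AudioListener.volume (which scales all audio heard in the scene); combining both actions in one class is an illustrative choice, not the project's exact split into two scripts:

```csharp
using UnityEngine;

// Hypothetical sketch: silence or restore all interface audio at once.
public class MuteToggle : MonoBehaviour
{
    public void Mute()   { AudioListener.volume = 0f; } // silence every AudioSource
    public void Unmute() { AudioListener.volume = 1f; } // restore normal playback
}
```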