As prototyping workflows continue to evolve, there are more and more ways to interact with your design. Everything from touch drag and tap gestures to keyboard shortcuts and voice recognition can not only be used but combined to trigger a variety of interactions. To support this expanding functionality, we've made a number of improvements to the XD application. Let me show you just a few of them.
I'm here in prototype mode in Adobe XD and I have a number of artboards that I'm working with. What I want to do is allow multiple interactions with this tile to transition to different artboards.
To begin with, what I'll do is come in and select this tile on the canvas, and then I'm going to select that arrow there on the right-hand side and just press and drag it to the left.
What I'd like to do is show a drag behavior here to simulate a carousel between the tiles on the left and right artboards.
With the object selected, here in the Property Inspector I can come in and, under Trigger, select Drag. I'd like to use Auto-Animate for the transition, and I do want it to go to that first card.
The next thing I want to do is, when you tap on this tile, have it take you to this detailed view artboard here towards the bottom.
Notice that with that object still selected, instead of an arrow I have a plus sign. This allows me to add a second interaction. I'll come to the plus sign, press and drag a wire to the detailed view artboard, and when I release I want to go in and change that to a Tap gesture, and I want it to Auto-Animate to that Yosemite info detail page.
Now, in addition to tap and drag, I'd also like to use speech recognition as a trigger between artboards.
Once again I'll come in and press and drag a wire from that tile to the Yosemite detail artboard, and here in the Property Inspector I'm going to come and select Voice.
Now, one thing you'll notice is that Drag and Tap are no longer in that menu. XD keeps track of what I do and makes sure that I don't create a conflict. So, for example, I can only have one tap gesture on this object, but I can have multiple keyboard commands or multiple voice commands, so those are still in the menu.
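To make that conflict rule concrete, here is a minimal, purely illustrative TypeScript sketch of how a prototyping tool could filter the trigger menu for an object. The names here are assumptions for illustration only and are not part of Adobe XD or its plugin APIs; they simply model the behavior described above.

// Hypothetical model of prototype triggers; not an Adobe XD API.
type TriggerType = "tap" | "drag" | "voice" | "keys";

interface Interaction {
  trigger: TriggerType;
  command?: string;              // spoken phrase or key combo, where relevant
  destinationArtboard: string;
}

// Triggers that may appear only once per object.
const SINGLE_USE: TriggerType[] = ["tap", "drag"];

// Return the trigger types still available for a new interaction,
// given the interactions already defined on the object.
function availableTriggers(existing: Interaction[]): TriggerType[] {
  const all: TriggerType[] = ["tap", "drag", "voice", "keys"];
  return all.filter(t =>
    !SINGLE_USE.includes(t) || !existing.some(i => i.trigger === t)
  );
}

// Example: after wiring a drag and a tap on the tile, only voice and keyboard
// triggers remain available in the menu.
const tileInteractions: Interaction[] = [
  { trigger: "drag", destinationArtboard: "First Card" },
  { trigger: "tap", destinationArtboard: "Yosemite Info" },
];
console.log(availableTriggers(tileInteractions)); // ["voice", "keys"]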
I'll go ahead and select Voice, and I'll add a command.
Now, one thing I'd like to do as part of a usability study is test which voice commands make the most sense to my users, so I want to add a few more voice commands.
To do that, I could once again come in and press and drag a wire. Alternatively, with the object selected on the canvas, here in the Property Inspector I can now click in the upper right-hand corner of the interaction area to add my next interaction.
Once again, I want it to be Voice, and I want it to Auto-Animate between the artboards. I'll go ahead and type a command, and because I am connecting artboards here in the Property Inspector, I'll go ahead and select Yosemite from the list here, and then I'll hit Return.
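Since I'm testing several phrasings, here's another small, hypothetical TypeScript sketch that captures the idea of routing several candidate voice commands to the same destination artboard. Again, the names are assumptions for illustration, not an XD API, and the phrases are placeholders for whatever commands your study compares.

// Hypothetical data for a voice-command usability study; not an Adobe XD API.
interface VoiceCommand {
  phrase: string;               // what the user says
  destinationArtboard: string;  // where the prototype navigates
}

// Candidate phrasings under test; all navigate to the same detail artboard.
const candidates: VoiceCommand[] = [
  { phrase: "show details", destinationArtboard: "Yosemite Info" },
  { phrase: "tell me more", destinationArtboard: "Yosemite Info" },
];

// Simple lookup a prototype runtime might perform when speech is recognized.
function resolve(spoken: string): string | undefined {
  return candidates.find(c => c.phrase === spoken.toLowerCase())?.destinationArtboard;
}

console.log(resolve("Tell me more")); // "Yosemite Info"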
Now some nice subtle things have happened in the interface that I'd like to point out.
First off, when you look here at the wire going from this first artboard to the second, there's a circle with the number 3.
If I roll over that wire, a tooltip appears letting me know that I've applied one tap gesture and two different voice commands, so I get some nice information there.
It's also much easier to delete wires once I've defined them. I could come, let's say, to this antelope artboard, click on a drag gesture that I've defined, and then just press Delete on the keyboard to delete the wire between that artboard and the next.
Another productivity enhancement: if I have a master component and I apply interactions to it, whenever I drag out instances of that component, the interactions will apply to those instances as well.
Well, obviously this is an incredible new capability, and I encourage you to give it a try.