Test APL Skills in the Developer Console Simulator
Use the simulator in the developer console to test your Alexa Presentation Language (APL) skills with different screen sizes.
Prerequisites: Minimum custom-skill configuration for testing
Before you test an APL skill in the simulator, make sure that your skill meets the minimum configuration for a custom skill:
- Create an interaction model with an invocation name and at least one custom intent with sample utterances.
- Build your interaction model without errors.
- Configure a valid endpoint and deploy your skill code to your endpoint.
- Configure the APL interface (`Alexa.Presentation.APL`).
For details about configuring your skill in the developer console, see the Build page.
For details about enabling the APL interface, see Configure a Skill to Support Alexa Presentation Language (APL).
Test an APL skill in the simulator
To open the developer console simulator
- Open the developer console, locate the skill you want to test, and then click Edit.
- Click the Test tab.
- For the Skill testing is enabled in option, select Development or Live.
- To display the simulator, select the Device Display check box.
- Under Alexa Simulator, invoke your skill, and then enter your test utterances.
- To see how the content looks on a device with a screen, scroll down past the Skill I/O section.
Test your APL responses on different screen sizes
The test simulator displays your APL content much as it appears on devices with screens. To see the screen simulator, scroll past the Skill I/O section.
| UI element | Description |
| --- | --- |
| 1 | Display – Select to view the screen device simulator. The simulator appears after the Skill I/O section. |
| 2 | Viewports drop-down list – Select from the different viewports to preview. Selecting a new viewport resets your skill session, so choose the viewport to test before you invoke your skill. For details about the available viewports, see Test different viewports. |
| 3 | Smart Motion Simulator On – Select to open a simulator for devices that can rotate the screen and turn to face the user, such as the Echo Show 10. This toggle appears when you select the Hub Landscape Large viewport. For details about using the smart-motion simulator, see Use the smart-motion simulator. |
| 4 | Alexa Simulator tab – Enter your utterances in the field to invoke and test your skill. You can also click the microphone to test by voice. |
Test different viewports
You can select different types of devices from the drop-down list after the Skill I/O section. The selected device determines the data provided in the `context.Viewport` property in the request sent to your skill.
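Your skill code can read `context.Viewport` to adapt its response to the selected device. The following is a minimal sketch in plain Python (no ASK SDK), operating on the parsed JSON request body; the layout names are hypothetical, and the `shape`, `pixelWidth`, and `pixelHeight` fields match the Viewport object the simulator sends.

```python
def choose_layout(request_envelope: dict) -> str:
    """Pick a layout name based on the viewport in the request.

    `request_envelope` is the JSON request body Alexa sends to the skill,
    parsed into a dict. The returned layout names are hypothetical.
    """
    viewport = request_envelope.get("context", {}).get("Viewport")
    if viewport is None:
        # No Viewport property: the device has no screen.
        return "voice-only"
    if viewport.get("shape") == "ROUND":
        return "round-small"
    width = viewport.get("pixelWidth", 0)
    height = viewport.get("pixelHeight", 0)
    return "landscape" if width >= height else "portrait"

# Example request fragment, as the simulator sends for Hub Round Small:
envelope = {"context": {"Viewport": {"shape": "ROUND",
                                     "pixelWidth": 480,
                                     "pixelHeight": 480}}}
print(choose_layout(envelope))  # round-small
```

Switching viewports in the simulator changes this `Viewport` data, so you can verify each branch without a physical device.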
The available devices correspond to the viewport profiles available in the viewport profiles package for use in an Alexa Presentation Language (APL) document:
- Hub Round Small (480 x 480)
- Hub Landscape Small (960 x 480)
- Hub Landscape Medium (960 x 600)
- Hub Landscape Large (1280 x 800)
- Hub Landscape Extra Large (1920 x 1080)
- Hub Portrait Medium (1080 x 1920)
- TV Landscape Extra Large (960 x 540)
- Mobile Small (600 x 1024)
- Mobile Medium (1024 x 600)
- Mobile Large (1280 x 800)
For details about the properties of these viewports, see Alexa Viewport Profiles Package.
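To take advantage of these profiles inside an APL document, you can import the viewport profiles package and use its `@viewportProfile` resource in `when` conditions. The sketch below builds such a document as a Python dict; the package and APL version strings and the text content are illustrative, so check the Alexa Viewport Profiles Package reference for current values.

```python
import json

# Hedged sketch of an APL document that imports the viewport profiles
# package and swaps components per viewport profile.
apl_document = {
    "type": "APL",
    "version": "2023.2",
    "import": [{"name": "alexa-viewport-profiles", "version": "1.6.0"}],
    "mainTemplate": {
        "items": [
            {
                # Shown only on small round hubs (Hub Round Small).
                "when": "${@viewportProfile == @hubRoundSmall}",
                "type": "Text",
                "text": "Round layout"
            },
            {
                # Fallback for every other profile.
                "type": "Text",
                "text": "Default layout"
            }
        ]
    }
}
print(json.dumps(apl_document, indent=2))
```

In the simulator, selecting Hub Round Small from the viewport drop-down list should render the first item, and any other viewport renders the fallback.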
You can also create a custom device profile for testing. A custom profile is useful for testing how your skill looks on other types of devices. You can also use a custom profile to test how your skill works on a device with a screen that doesn't support video playback.
To create a custom device profile
- In the developer console simulator, below the Skill I/O section, click the drop-down list to open the list of standard viewports.
- Scroll to the bottom of the list, and then click Add Custom Device.
- From the drop-down menus and fields, change the properties of the viewport that you want to view. You can set the Shape, Pixel Width, Pixel Height, and Pixel Density. You can also select the specific Video Codecs the device supports, or select Disallow Video to create a device that doesn't support video playback at all.
- Click Apply.
The custom device remains available during your current browser session. If you close the browser, and then later reopen the simulator, you must recreate the custom device.
Use the smart-motion simulator
The smart-motion simulator displays your APL document next to a top-down view of a motion-enabled device with a user. In the APL document view, interact with your document. In the motion simulator, move the user icon to see the device follow and turn to face the user.
The simulator is available for the Hub Landscape Large viewport. If you don't see the toggle to turn on the simulator, make sure you have selected Hub Landscape Large.
| UI element | Description |
| --- | --- |
| 1 | Device icon – Represents a device with a screen that can rotate. This icon rotates to illustrate how the device moves in response to your skill. |
| 2 | User icon – Represents the user. Click this icon to simulate the user speaking the wake word. Click elsewhere in the simulator to move the user to a different location, and then see how the device responds. |
| 3 | Device field of view – The shaded area shows the area that the device camera can see. Click anywhere in the simulator to move the user to that location. |
| 4 | Rotation / motion commands – Displays the current device rotation and the most recent motion command. |
Simulate the wake word response
When a user says the wake word and invokes your skill, the device responds with built-in motion, such as turning the screen to face the user. This built-in motion is called the wake word response. You can configure the wake word response your skill uses.
You can test the wake word response for your skill in the simulator and simulate the user moving around the room.
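The wake word response is set where your skill's APL document requests the smart-motion extension. The sketch below shows one plausible shape for that request as a Python dict; the extension URI, settings structure, and version strings are assumptions, so verify them against the smart-motion extension reference before relying on them.

```python
import json

# Hedged sketch: an APL document that requests the smart-motion extension
# and configures the wake word response. URI and settings shape are assumed.
apl_document = {
    "type": "APL",
    "version": "2023.2",
    "extensions": [
        {"name": "SmartMotion", "uri": "alexaext:smartmotion:10"}
    ],
    "settings": {
        "SmartMotion": {
            # turnToWakeWord: rotate the screen toward the user on the wake word.
            "wakeWordResponse": "turnToWakeWord"
        }
    },
    "mainTemplate": {"items": [{"type": "Text", "text": "Hello"}]}
}
print(json.dumps(apl_document, indent=2))
```

With a document like this rendered in the simulator, clicking the user icon should trigger the configured response.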
To test the wake word response for your skill
- In the developer console simulator, below the Skill I/O section, click the drop-down list to open the list of standard viewports.
- Select the Hub Landscape Large viewport, and then select the Smart Motion Simulator On option.
- Invoke your skill.
- In the simulator, hover over the user icon until you see Click user to say wake word.
- Click the user. The device icon responds according to your configured wake word response. For example, if the wake word response is `turnToWakeWord`, the device rotates to face the user icon.
To simulate the user moving around the room
- In the developer console simulator, below the Skill I/O section, click the drop-down list to open the list of standard viewports.
- Select the Hub Landscape Large viewport, and then select the Smart Motion Simulator On option.
- Invoke your skill.
- To move the user to a new location, click anywhere in the simulator.
For details about the wake word response, see Built-in smart-motion behavior (wake word response).
Test smart motion and entity sensing
With the smart-motion APL extension, you can get information about the motion state of the device and run commands to control the motion. With the entity-sensing APL extension, you can get information about the user the device detects.
When you use these extensions in your skill, you can test them with the simulator. For example, when your skill runs extension commands, such as `GoToCenter`, you see the device icon move in the motion simulator.
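An extension command such as `GoToCenter` runs like any other APL command, qualified with the name assigned to the extension. The sketch below, again as a Python dict, wires it to a touch handler; the `SmartMotion:` prefix assumes the extension was requested under the name `SmartMotion`, and the component layout is illustrative.

```python
# Hedged sketch: a component that runs the smart-motion GoToCenter
# extension command on press. Assumes the extension was requested with
# the name "SmartMotion" elsewhere in the APL document.
press_handler = {
    "type": "TouchWrapper",
    "item": {"type": "Text", "text": "Center the screen"},
    "onPress": [
        # Extension commands use the "<ExtensionName>:<Command>" form.
        {"type": "SmartMotion:GoToCenter"}
    ]
}
print(press_handler["onPress"][0]["type"])  # SmartMotion:GoToCenter
```

Pressing the component in the simulator's document view should move the device icon to its centered rotation in the motion view.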
You can test the following features in the smart-motion extension:
- All commands, except the `PlayNamedChoreo` command
- All live data properties
- All event handlers
You can test the following features in the entity-sensing extension:
- All environment properties
- All live data properties
- All event handlers
Smart-motion simulator limitations
The simulator has the following limitations:
- You can't test the smart-motion `PlayNamedChoreo` command.
- In the smart-motion extension, the `availableChoreos` environment property isn't available.
Last updated: Nov 28, 2023