Amazon Alexa

Skills Testing Simulator


The Alexa Testing Simulation Framework, a set of services and SDKs, made it easy to simulate specific Alexa devices in both web-based and programmatic testing environments. The release paved the way for all future testing solutions and closed the feature gap between testing simulators and actual devices in multi-turn conversation, dialog management, device rendering, and more.

Team + My role.

  • 1 UX Designer (me)

  • 2 Product Managers

  • 8 Developers

I led design for the complete redesign of the Alexa Skills Test Simulator. We dramatically simplified the user experience, focusing on core interactions, speed, and streamlined navigation. Making it beautiful in the process was just icing on the cake.


Historical context.

Previously, the testing simulator on the Echo Developer Website (EDW) was missing major simulation features such as voice input, session management, and device event logs. Additionally, the website was maintained manually and introduced engineering overhead every time new features were launched on the Alexa platform.



Primary goals of the redesign.

The launch of the redesigned Alexa Simulation Framework brings major improvements to the simulation problem. It comes with a core library that implements the full AVS device spec and communicates with Alexa skills via the same set of AVS APIs used by actual Alexa devices. This means parity with the devices can be maintained with little engineering effort.

The current device simulation capability on the Alexa Developer Portal is limiting. I have no idea how my skills will be rendered on devices such as the Echo Spot or Fire TV, and I cannot test against Echo Buttons at all. The only way to get my skill fully tested is to purchase all the devices from Amazon, and even then, managing the devices is a major operational overhead for our QA team.

" amazing experience to test skills in the browser should you not have a device ready. I love the beta already!"

- Dustin Steiner



Dialog Simulator.

With the simulator's new voice input capability, developers can now validate Alexa's understanding of the spoken word and how the skill responds. The new dialog-style UI pattern helps keep track of the interaction with Alexa and allows for easy inspection of relevant debugging information. The interaction with Alexa is retained in a dialog format in the left pane, so users can click back and forth between the dialog bubbles to resurface relevant information in the right pane.


Device Renderer.

In order for skill developers to fully and effectively test their skills, they need to be able to see the visual display that is shown to the end user on devices with a screen. With the launch of the Echo Show, developers who did not own the device needed a way to build and test the visual experience for their skill responses. Developers can now see what their skill looks like on the Echo Show.


Skill I/O + Device event logs.

The new device event log surfaces the AVS directives that are sent to the devices, allowing developers to understand how their skills interact with the device and to fully debug them. To further improve programmatic automated testing, we are also enhancing the skill-testing API to support features such as entity resolution and dialog management.
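To make the event log concrete, here is a minimal sketch of how a developer might inspect such a log programmatically. It assumes the standard AVS directive JSON shape (`header.namespace`, `header.name`, `payload`); the helper name and sample entries are illustrative, not part of the simulator's actual API or output.

```python
from collections import defaultdict

def group_directives_by_namespace(event_log):
    """Group device event log entries by their AVS directive namespace.

    Assumes each entry follows the common AVS directive shape:
    {"header": {"namespace": ..., "name": ...}, "payload": {...}}.
    """
    grouped = defaultdict(list)
    for directive in event_log:
        header = directive.get("header", {})
        grouped[header.get("namespace", "Unknown")].append(header.get("name", "Unknown"))
    return dict(grouped)

# Illustrative sample log (not real simulator output).
sample_log = [
    {"header": {"namespace": "SpeechSynthesizer", "name": "Speak"}, "payload": {}},
    {"header": {"namespace": "TemplateRuntime", "name": "RenderTemplate"}, "payload": {}},
    {"header": {"namespace": "SpeechSynthesizer", "name": "Speak"}, "payload": {}},
]

print(group_directives_by_namespace(sample_log))
# {'SpeechSynthesizer': ['Speak', 'Speak'], 'TemplateRuntime': ['RenderTemplate']}
```

Grouping by namespace mirrors how the event log pane lets developers scan at a glance which device capabilities (speech, screen templates, and so on) their skill response actually exercised.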



What the community is saying.


What the press is saying.