originally posted by Jamey Baumgardt: (link) - please comment at original post
One of the first projects I worked on here at IdentityMine was a functional demo we put together for Intergraph, a company that specializes in 911 call center dispatcher software. In cooperation with Microsoft, we wanted to show Intergraph how Windows 7 and a touch-enabled interface could enhance their existing software and make dispatchers’ jobs easier and more efficient. Enterprise installations are an important part of any operating system update, so Microsoft was willing to assist with various resources during the project.
Dispatchers take emergency calls and have to be able to document information from their callers, as well as access additional information from their data servers, extremely quickly and accurately. It’s no joke—these people save lives, and their software needs to work seamlessly and be highly intuitive so as to minimize errors.
Our task was straightforward, but by no means easy: integrate a touch interface into the current system and workflow in a way that would enhance the dispatchers’ user experience and improve their overall work performance. Our touch-enabled solution had to be user-centric, have a sophisticated interface design that could accommodate large amounts of data, and, oh yeah, it also had to actually work for the client’s demo.
After distilling the information we gathered from numerous user and stakeholder interviews, and after many internal brainstorming meetings, we ultimately settled on a solution that used two monitors working in conjunction with one another. One monitor was a standard flat-screen panel housing the data-entry interface; the other was a touch-enabled flat panel serving as an interactive map. Both interfaces ran off one Windows 7 box and had to be able to “communicate” with each other: actions carried out on the touch-screen map triggered pieces of workflow on the data-entry screen, and vice versa. It couldn’t be two interfaces cobbled together; it had to feel like one seamless, natural experience.
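One common way to keep two loosely coupled interfaces in sync like this is a simple publish/subscribe event bus, where each screen publishes its actions and reacts to the other's. The sketch below is a minimal illustration of that pattern, not the actual demo code; the topic name, payload fields, and handler wiring are all hypothetical.

```python
from collections import defaultdict

class EventBus:
    """Minimal publish/subscribe bus so two screens can react to each
    other's actions without being directly coupled."""

    def __init__(self):
        self._subscribers = defaultdict(list)

    def subscribe(self, topic, handler):
        self._subscribers[topic].append(handler)

    def publish(self, topic, payload):
        for handler in self._subscribers[topic]:
            handler(payload)

# Hypothetical wiring: the touch-screen map publishes an event when a
# unit icon is dragged onto a call; the data-entry screen listens and
# updates its workflow pane in response.
bus = EventBus()
workflow_log = []
bus.subscribe(
    "unit_assigned",
    lambda e: workflow_log.append(f"Unit {e['unit']} assigned to call {e['call']}"),
)
bus.publish("unit_assigned", {"unit": "P-12", "call": "C-881"})
# workflow_log now records the assignment on the data-entry side
```

Because either screen can publish and either can subscribe, the same mechanism covers the "and vice versa" direction with no extra machinery.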
Essentially, our goal was to improve the overall workspace by centering the user’s focus, ensuring an ergonomic interaction with regard to a person’s typical range of motion, and “extending the keyboard” with a touch-enabled surface. We learned in our interviews that the keyboard is a vital tool for dispatchers and not something they’d want to give up. So rather than replacing the keyboard with a fully touch-enabled solution, we compromised and settled on the idea of extending the keyboard through touch instead.
The touch screen itself houses the map of the area the dispatcher is responsible for, and allows them to customize their view, locate calls, and track police and aid units. Through the touch interface we were able to simplify many parts of the dispatchers’ traditional workflow. For example, to assign an available police unit to a new event, the dispatcher simply taps and drags the police icon on the map to the event icon associated with the call, and voilà, the unit is assigned.
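Under the hood, a drag-to-assign gesture like this typically comes down to hit-testing: when the finger lifts, find the event icon nearest the drop point within some tolerance and record the assignment. The sketch below illustrates that idea under assumed names (`MapIcon`, `find_drop_target`, a 24-pixel tolerance); it is not the demo’s actual implementation.

```python
from dataclasses import dataclass
import math

@dataclass
class MapIcon:
    id: str
    x: float
    y: float

def find_drop_target(drop_x, drop_y, icons, radius=24.0):
    """Return the icon nearest the drop point within `radius` pixels,
    or None if nothing is close enough."""
    best, best_dist = None, radius
    for icon in icons:
        d = math.hypot(icon.x - drop_x, icon.y - drop_y)
        if d <= best_dist:
            best, best_dist = icon, d
    return best

def assign_unit(unit_id, drop_x, drop_y, call_icons, assignments):
    """Drag-to-assign: if the unit icon was dropped on a call icon,
    record the assignment and return the target icon."""
    target = find_drop_target(drop_x, drop_y, call_icons)
    if target is not None:
        assignments[target.id] = unit_id
    return target

calls = [MapIcon("C-881", 100, 200), MapIcon("C-882", 400, 50)]
assignments = {}
assign_unit("P-12", 105, 196, calls, assignments)  # drop lands near C-881
```

A generous drop radius matters here: fingers are far less precise than a mouse cursor, so touch targets need more forgiveness than a pixel-exact hit test would give.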
Ultimately we delivered a fully functional demo that the client shared at a conference in Washington D.C. in June of this year. The demo was very well received and generated a lot of excitement and interest in touch-enabled interfaces. Certainly touch-enabled interfaces will continue to replace traditional solutions as the technology advances over time. The next several years are going to be very exciting, and I think all of us at IdentityMine feel pretty lucky to be working in such a cool industry.