Lee Andron is the Mobile Senior Principal Product Manager at Progress. With over two decades working with Internet technologies and 12 years working specifically in mobile, Lee has helped more than 35 Fortune 100 companies make use of next-generation technology. Lee leverages user experience design (UX), information architecture, advanced mobile web, and web application development and design principles to develop creative solutions to customer challenges. Lee was one of the early evangelists of HTML5 as a ubiquitous mobile solution, and co-authored the book iPhone and iPad Web Design for Dummies.
Recently, Lee and I sat down to talk about the future of human-computer interfaces, our increasingly connected world, and what that means for today’s business software application developers.
Suzanne: There are so many exciting new interfaces for computers and mobile devices doing things like using the surface around the device as a track pad, or as buttons, to extend application environments. But many of the examples I see are of consumer apps. Things like playing a xylophone or muting your alarm in the morning are cool. But where do you see this kind of technology fitting into the business world?
Lee: I agree with you that there is some pretty bleeding-edge stuff available right now, and that's the kind of thing I think is genuinely exciting. As a category, I would call these updates to the abilities of devices to allow users to enter information. It started with the keyboard, and then we got a mouse. But interfaces evolved into touch-screen displays, and then quickly on to voice interactions. Now we have started moving beyond the display and into the 3D space in front of it. All of these are evolutions in the way people interact with computers, whether they are phones, tablets or desktop computers. It's all evolutions of the man-machine interface. In that category, I think we're still just scratching the surface of what we can do, and what will be most natural for users.
Lee Andron, Mobile Senior Principal Product Manager, Progress Software
Suzanne: When you say “moving beyond the display,” do you mean gesture support with things like Xbox Kinect or Leap Motion?
Lee: Sure. Imagine that you're working at a sandwich restaurant. Right now, you have to touch the menu display to put in an order, then put gloves on to make your sandwich, then go back at the end and touch a screen again to take cash, and then handle money. But what if you could simply point or wave your hand at the display to scroll through sandwiches, and then point to order the right one? Or even, while you're making the sandwich, wave your hand and move on to the next one? Things like that, on a pragmatic level, are going to affect how you serve your customers and your business in a really big way. The crux of that idea is, "What can we do that is more natural to human beings?" and how can we bring the metaphors in our user experience design much closer to the real world? A throwback example would be the tabs at the top of many web pages. We figured those out in the 90s. People were used to using manila folders, so emulating that real-world object was a good way to meet them halfway, and that interface was born.
Suzanne: What I think I hear you saying is that man-machine interaction is becoming more intuitive. I was trying to think how you might use a tool like Google Glass. You might be a line worker on a telephone pole and need a part, and you could put in an order using just your eyes.
Lee: The best real-world case I have of that is Beth Israel Deaconess Hospital. Their surgeons use Google Glass. They can receive real-time information during surgery about the status of the patient, and they can change what they see just by gesturing with their heads, because scrolling through a list only requires looking up and down.
Suzanne: It’s exciting to think that people with disabilities like Stephen Hawking, for example, are going to be able to communicate in new ways. The user interface and the human being are going to become more as one.
Lee: Exactly. This is the field of ergonomics, a study within psychology, and ergonomics essentially means the man-machine interface. The real crux of this science started in the 50s with the space race and putting the right information in front of the astronauts. It's about making it easier for people to interface with machines: to enter information in, and to get information out. The closer we get to the natural world, the more the metaphors hold true and the easier it is for somebody to learn how to use something.
An example of Google L design
Another great example of this, one that is really rooted in the 2D world, is Google's new design ethos. They call it "L." What it really boils down to is, they sat with a piece of paper and looked at what would happen if you, say, picked it up and moved it. How does it interact with other papers? So you get an idea of flat design. It's similar to the iOS 7 design ethos, where everything is super simple, with plain colors, no 3D effects, and gradients removed. Android is adding in the physical reaction that you would get from moving a real thing. The Android design guidelines are going to make that feel so real that it's going to tease your brain and make you think it's the real thing.
Suzanne: Is that available right now, or in the next Android version?
Lee: In the next big release, the Android operating system will include this in the main parts of their apps. Google has published their design language so others can follow their lead as quickly as possible. If someone is making an app with our Progress® Rollbase® rapid application development tool, they will be able to design with that language so that their apps feel natural to someone used to Android.
This all ties into the Internet of Things (IoT), too, and what connected pieces really mean: sensors and things like that.
Suzanne: So, your typical guy sitting in front of a computer—why would he want to use some kind of new touch interface rather than his mouse and keyboard? Why would those be important for an office worker?
Lee: Super question: why do we care? Here's the answer: things like this get niche uses first, and then they are brought into normal usage by habit. Let's consider gesture support. There's this beautiful little ring that fits on your finger and lets you do all these gestures without having to plug anything into your computer. It's Bluetooth-based. You can use it with your TV as a remote control. You're sitting across the room and want to change the channel? Just wave your hand. The first business use of something like this would be, I think, for presentations. You're at the front of the room with a projector on and you don't want to say "next slide." You don't want to reach over and hit your keyboard. Wouldn't it be cool if you could just wave your arm and go on to the next slide? That's pretty neat.
The Nod Ring allows wearers to control devices remotely with hand gestures
That's where the first devices are going to sneak into the workplace, and there's both a lean-in experience and a lean-back experience. We usually think about the phone as a lean-in experience and the tablet as a lean-back experience. A computer really does both of those. There are times when you're leaning in and focused on something like writing an email. Then there are times when you lean back and read the news. When you're doing lean-back work, gestures will be so much easier than searching for the right button or keeping your finger on the keyboard. A wave-your-hand, natural motion, getting back to ergonomics, just feels easier. When you get your hands on one of these things, or watch some of the videos of people using them, you see how they break the boundaries of the 2D computing environment and let you interact with it in a different way. And by the way, usability studies bear these benefits out.
Suzanne: I've read that there are going to be sensors in plant leaves and anything touchable. That seems somewhat far out there, but is that really going to happen?
Lee: Yes! You haven't dreamed it. People are using sensors in all kinds of, frankly, super weird places and things. But to bring it to a more practical level, the stuff that they're talking about is generally categorized as Radio Frequency ID (RFID) devices. RFID devices can be one-way or two-way.
The one-way stuff can be used for things like package tracking. RFID chips can set off alarms if you walk out of a store without having them removed. That little chip thing? That's an Internet of Things (IoT) sensor. It provides information back to a main system.
The information provided by a sensor on its own isn't very helpful. What is helpful is lots of that information accumulated over time and applied to specific circumstances. A big use of this is in oil pipelines. They'll have a sensor every twenty feet or so, and each one might report vibration on a scale of one to ten. A single reading doesn't tell you much, but over time, if you see spikes at certain times of day or spikes coinciding with the weather, you can make adjustments. It's that kind of data that makes a difference with IoT sensors.
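To make that concrete, here is a minimal sketch of the idea in TypeScript. The type names, window size and spike threshold are invented for illustration; it simply accumulates readings per sensor and flags any reading that jumps well above that sensor's recent baseline.

```typescript
// Hypothetical sketch: accumulate vibration readings (scale 1-10) per sensor
// and flag readings that spike well above that sensor's recent average.

interface VibrationReading {
  sensorId: string;   // e.g. "pipeline-segment-0042"
  level: number;      // vibration on a 1-10 scale
  timestamp: Date;
}

class VibrationMonitor {
  private history = new Map<string, number[]>();

  constructor(
    private windowSize = 100,   // how many recent readings form the baseline
    private spikeFactor = 2.0,  // "spike" = more than 2x the recent average
  ) {}

  // Record a reading and report whether it looks like a spike.
  record(reading: VibrationReading): boolean {
    const recent = this.history.get(reading.sensorId) ?? [];
    const baseline =
      recent.length > 0
        ? recent.reduce((sum, v) => sum + v, 0) / recent.length
        : reading.level;

    recent.push(reading.level);
    if (recent.length > this.windowSize) recent.shift();
    this.history.set(reading.sensorId, recent);

    return recent.length >= 10 && reading.level > baseline * this.spikeFactor;
  }
}

// Usage: feed in readings as they arrive from the field.
const monitor = new VibrationMonitor();
const isSpike = monitor.record({
  sensorId: "pipeline-segment-0042",
  level: 7,
  timestamp: new Date(),
});
if (isSpike) {
  console.log("Unusual vibration; correlate with time of day and weather.");
}
```

The point is less the code than the pattern: raw pings only become useful once they are collected and compared against context.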
It's often said that 90% of the world's data was created in the last two years, and IoT sensors gathering all this data are a big part of why. We need to look at that data from a distance so we can see the forest for the trees.
Suzanne: What kind of capabilities are available now in Progress® Rollbase® that can help our users take advantage of these exciting technologies?
Lee: My specific product focus at Progress is Rollbase Mobile. Right now, I’m working on a new tool that will be called App Designer. App Designer will take the information that Rollbase has to show, and let you create any kind of interface—for the desktop, web, mobile web, tablets—any device. The idea is a single interface to design all of your apps, whether they be mobile or they be Google Glass or Pebble or anything, really.
BLE devices can be seeded almost anywhere to capture location data
Rollbase can help developers make these great interfaces for wearables, and leverage the wider data from IoT sensors. That's important. One example is a company that makes systems for the logging industry. Their system tracks tree products from the moment the trees are cut, through the boards being milled, to the finished lumber sitting in the storeroom waiting to be sold, so the sales people know what kind of windows can be made from a given type of wood. It shows the whole process, whether you are in forestry or sales, or with a company that ties those two things together into another business.
One of our customers is going to use a specific kind of IoT device: iBeacons (that's the Apple trade name), or more generally Bluetooth Low Energy (BLE) devices. These serve much the same role as RFID chips. BLE beacons send out a really small ping announcing an object's location. In these scenarios, a given tree can also be tracked via barcodes. The pings can be brought into an application built on Rollbase and trigger actions that are pushed out to the interfaces the people in the back office see. They can also be pushed out to people using a company's mobile apps.
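As a rough illustration of that flow, here is a short TypeScript sketch of a beacon listener forwarding each ping to a server-side application. The endpoint URL, payload shape and `BeaconPing` type are invented for this example; they are not Rollbase's actual API.

```typescript
// Hypothetical sketch: forward BLE beacon pings to a back-end application
// so back-office screens and mobile apps can react to new location data.

interface BeaconPing {
  beaconId: string;   // identifies the tagged object, e.g. a logged tree
  rssi: number;       // signal strength, a rough proxy for distance
  observedAt: string; // ISO-8601 timestamp
}

async function forwardPing(ping: BeaconPing): Promise<void> {
  const response = await fetch("https://example.com/api/beacon-pings", {
    method: "POST",
    headers: { "Content-Type": "application/json" },
    body: JSON.stringify(ping),
  });
  if (!response.ok) {
    throw new Error(`Failed to forward ping: ${response.status}`);
  }
}

// Usage: whenever the listener hears a beacon, forward the ping.
forwardPing({
  beaconId: "tree-000417",
  rssi: -68,
  observedAt: new Date().toISOString(),
}).catch(console.error);
```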
From a pragmatic perspective, integrating these technologies into mobile and web applications will be about coming up with the right scenarios for a customer. A development team can then use almost anything you can imagine in the Internet of Things to provide the right data to the apps they build in Rollbase. Right now, we have the APIs and libraries for an amazing array of wearable and sensor devices that can send information back to your application.
To get started building your own data-driven mobile application, visit Progress Rollbase and register for a trial today.
Suzanne Rose was previously a senior content strategist and team lead for Progress DataDirect.