The Future of Touchscreens – Where Will We Be In 10 Years?
Today our mobile phones instantly send money across the world, take inner-ear scans and remotely control the temperature of our homes. There is only one thing we can’t easily do on a smartphone: write. Any detailed letter, in-depth composition or data entry quickly devolves into a nightmare of pecking clumsily at virtual keys. Typing this way is unnatural, and the touchscreen still has a long way to go before it enables life-improving productivity.
Fluent typing rests on three essential components: touch input, orientation, and confirmation. A traditional keyboard has all three, satisfying what we can call the efficiency loop. BlackBerry was a pioneer in its time, developing something close, but eventually lost out because its user experience fell short of what Apple created next.
The iPhone, offering a larger, interactive touchscreen, entirely changed the conversation and the market with its vision centered on content consumption. But this came at a cost. Touchscreens have touch input, one of the efficiency loop’s three parts, but they are missing the other two. You cannot orient your fingers by feeling the physicality of keys, nor do you have a non-visual way to confirm that a key was pressed. By making it difficult to build muscle memory, touchscreens also make it difficult to get into any kind of creative flow. Even today, you still hear people say, “the one thing I miss about my BlackBerry is how much easier it was to type.”
The fact is that the smartest phones in the world hinder our productivity and creativity by offering a flat touchscreen experience instead of something tactile. But two technologies are emerging that finally allow smartphones to be used for both consumption and productivity.
How Biology is Guiding The Future of Keyboards
Haptics, the science of simulating touch sensations, activates sensors in your fingertips with vibrations to tell your brain that you’ve “pressed” a button. These sensors are also how we know we are feeling a phone screen, for example, and not dog fur. The Apple Watch and newer MacBook trackpads apply haptics in a unique way: clicking the trackpad does not actually move it down; instead, the haptic system simulates that feeling, signaling to the brain that you’ve clicked the pad. This provides the confirmation portion of the efficiency loop.
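The confirmation half of this idea is easy to model in software. Below is a minimal, hypothetical sketch in Python, not any vendor’s actual haptics API: a virtual key fires a short haptic pulse through a stand-in driver the moment a press registers, giving the user non-visual confirmation of the keystroke.

```python
from dataclasses import dataclass
from typing import Callable, List

@dataclass
class VirtualKey:
    """Toy model of a touchscreen key with haptic confirmation.

    `haptic` stands in for a platform haptics driver; in this sketch
    it simply records the pulse durations it was asked to play.
    """
    label: str
    haptic: Callable[[int], None]  # pulse duration in milliseconds

    def press(self) -> str:
        # Fire a brief pulse so the brain gets non-visual confirmation
        # that the key registered -- the simulated "click".
        self.haptic(10)
        return self.label

# Stand-in driver that logs pulses instead of driving hardware.
pulses: List[int] = []
key = VirtualKey(label="a", haptic=pulses.append)

typed = key.press()
print(typed)   # "a"
print(pulses)  # [10] -- one 10 ms confirmation pulse was requested
```

A real implementation would route the pulse to the device’s haptic engine rather than a list, but the shape is the same: confirmation is decoupled from any visual change on screen.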
Microfluidics is the yin to haptics’ yang, activating a complementary set of sensors in your fingers that are sensitive to curves and edges. Microfluidics — “the science and technology of systems that process or manipulate small amounts of fluids” — offers a way to create dynamic, transparent tactile surfaces: fluid pushed through a plastic layer fills tactile bumps that rise into keyboard formation and recede again. Pressing your fingers against microfluidic bumps provides the remaining component of the efficiency loop: orientation. The biological explanation is proprioception, a core part of building muscle memory and learning your way around a keyboard without looking. Without feeling the edges and curves of individual keys, or a satisfying click, we cannot train that muscle memory. Think of a crosswalk button: how would we confirm it was pressed without feeling it move down, hearing a sound, or feeling it vibrate back? Combining haptics and microfluidics provides the orientation and confirmation needed for full typing productivity, closing the efficiency loop.
What types of keyboards will arise (no pun intended) in the next five years?
There will be three generations of touchscreen devices implementing haptic and microfluidic technologies, and the first has already begun: mobile device manufacturers are incorporating fixed microfluidic-enabled keyboards that help you orient your fingers on the screen. These first-generation microfluidic devices are also very power efficient, using less than a tenth of one percent of battery charge in a day.
The second generation will be a programmable combination of the two technologies. Physical keys will adjust in size and position rather than sitting in fixed locations, with haptic feedback to match. Software keyboards can already learn and adjust to how we type; in the next touchscreen generation, we will see keyboards that use microfluidic tactile technology to create a personalized layout for each user, based on the size of their hands and fingers and their style of typing. This addresses challenges that are exacerbated by switching devices. If your new phone is significantly different in size, it can feel like learning to type all over again: your fingers keep hitting the wrong letters because your muscle memory expects your previous keyboard. If keys were raised and their sizes somewhat standardized, orienting yourself on a new device would be much easier, and your productivity wouldn’t have to suffer when you need a bigger, smaller or split-screen keyboard.
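One way such personalization could work, sketched here as an illustrative Python toy rather than a description of any shipping keyboard: track where a user’s touches actually land for each key and drift that key’s target centroid toward the observed average, so the layout gradually adapts to the user’s hands.

```python
class AdaptiveKey:
    """Drifts a key's target position toward where touches actually land."""

    def __init__(self, x: float, y: float, rate: float = 0.2):
        self.x, self.y = x, y
        self.rate = rate  # adaptation speed (0 = never adapt, 1 = jump instantly)

    def register_touch(self, tx: float, ty: float) -> None:
        # Exponential moving average: nudge the key's centroid a fraction
        # of the way toward each observed touch point.
        self.x += self.rate * (tx - self.x)
        self.y += self.rate * (ty - self.y)

# Suppose a user consistently hits 2 px to the right of the nominal
# "j" position; the key's target drifts to meet their finger.
j = AdaptiveKey(x=100.0, y=50.0)
for _ in range(20):
    j.register_touch(102.0, 50.0)

print(round(j.x, 2))  # converges toward 102.0
```

A microfluidic surface could then raise the physical bump at the adapted position, so the tactile key sits where that user’s finger naturally falls.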
Haptics and microfluidics will evolve together to bring tactility and vibration to more than just keyboards and buttons. First, keyboards will move around and adapt seamlessly to each app and device orientation (portrait or landscape). Buttons will appear anywhere on the screen as apps and websites call for them. You will be able to feel the varying contours of a map. Visually impaired users will be able to choose a braille option when navigating or writing on a touchscreen device, without having to buy separate hardware. Haptics will later simulate clicks and vibrations of varying length and strength depending on the activity. This direction is likely because these technologies preserve the usability of a large touchscreen for consumption while adding features only when you want them: where past devices with physical keyboards sacrificed screen space, these surfaces can be deactivated for full-screen real estate when you don’t need to interact.
Touch is the first sense that we acquire. It fosters human connection with each other, and even with our personal technology. New complementary touch technologies will change the touchscreen game by turning static devices into something entirely new: flat one minute, a keyboard the next, then a contoured 3D map, and back to a flat screen again. Someday soon, keyboards will respond to touch, to the environment, and to every user’s individual needs.