8 Prototypes That Will Change The Future

Holographic smartphones and exoskeleton gloves that let you feel things in virtual reality (VR): in these prototypes of soon-to-be-iconic products, you can still glimpse the sweat it took to build the breakthroughs presented this year at CHI 2016.

Each year, the Association for Computing Machinery, the world’s largest scientific and educational computing society, meets to explore the future of human-computer interaction at its flagship conference, CHI. It’s a fantastic event where thousands of researchers, scientists, and futurists gather to expand the boundaries of what it means to interact with machines.

It’s a dizzying collision of enough ideas about the future of humans and machines to put the world’s science fiction writers out of a job for good. This year’s CHI 2016 conference in San Jose was no exception, but among the hundreds of projects, eight stood out.

Haptic Retargeting

The problem: in VR, objects can look real, but they don’t feel real. In fact, in most cases, they don’t feel like anything at all. So Microsoft Research has developed a system that uses a small number of physical props to stand in for a virtually unlimited number of virtual objects.

They call it “haptic retargeting,” and it tricks a VR user into believing they aren’t interacting with the same prop over and over again. To pull this off, the system subtly warps the user’s virtual view so that the object they reach for in the game lines up with the physical prop they have already handled in meatspace.

It’s hard to put into words, so watch the video above. In it, Microsoft Research shows how haptic retargeting can convince a VR Minecraft player that they’re stacking many blocks while they physically move the same single block.
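The core of the trick is a small, progressive warp of the rendered hand. Here is a minimal sketch of how such a warp might be computed; the linear blend, the function names, and the use of reach progress as the blend factor are our own illustrative assumptions, not Microsoft’s actual implementation:

```python
import math

def dist(a, b):
    return math.sqrt(sum((x - y) ** 2 for x, y in zip(a, b)))

def retargeted_hand(hand, start, phys_prop, virt_target):
    """Warp the rendered hand so a reach toward virt_target lands on phys_prop."""
    # Progress of the reach: 0 at the start pose, 1 on contact with the prop.
    total = dist(start, phys_prop)
    alpha = min(1.0, dist(start, hand) / total) if total else 1.0
    # Blend in the prop-to-target offset as the hand travels.
    return [h + alpha * (v - p) for h, v, p in zip(hand, virt_target, phys_prop)]
```

When the physical hand arrives at the prop (`alpha` reaches 1), the rendered hand arrives exactly at the virtual target, so the user feels the prop precisely where they see the virtual block.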

Dexta Haptic Gloves

Speaking of haptics, another company with a very different plan to let people “feel” virtual reality is Dexta Robotics, which has developed a series of exoskeleton-style gloves that physically push back against your fingers in VR.

Here’s how they work: when you enter virtual reality, the Dexta Dexmo gloves simulate feedback by braking and releasing your finger joints with varying degrees of force as you try to touch digital objects. With this relatively simple technique, the gloves can simulate haptic sensations such as hardness, elasticity, softness, and more.

As Dexta Robotics explains, the force feedback changes the whole VR experience: you can feel the difference between elastic and rigid virtual objects, and you can pick up a virtual object and recognize by touch whether it is a rubber ball or a stone.
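A toy model of the joint-braking idea: resistance grows with how far the fingertip presses into a virtual object, with a stiffness constant per material. The function, numbers, and linear spring model here are illustrative assumptions, not Dexta’s actual control scheme:

```python
def joint_resistance(penetration_m, stiffness, max_force=5.0):
    """Resistance force (N) the glove applies as a fingertip presses
    a distance penetration_m (meters) into a virtual object."""
    if penetration_m <= 0:       # finger not touching the object yet
        return 0.0
    # Linear spring, capped at what the exoskeleton motor can deliver.
    return min(stiffness * penetration_m, max_force)

# A rigid stone saturates the motor almost immediately; rubber gives way.
stone  = joint_resistance(0.005, stiffness=2000)   # capped at max_force
rubber = joint_resistance(0.005, stiffness=200)    # gentle push-back
```

The same press feels hard or soft purely because of the stiffness constant, which is how one mechanism can fake many materials.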

Pre-Touch Detection

What if your smartphone could read your mind? That’s effectively what Microsoft Research is after with “Pre-Touch Detection.” It showed a new kind of smartphone that can sense how it is being gripped and detect when a finger comes near the screen. This could open up new possibilities for mobile user interfaces, from improving the accuracy of tapping small on-screen elements to dynamically adjusting the interface depending on how you hold the device or whether a finger is approaching the screen. For more details, read our full article on the new touchscreen here.
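One way a pre-touch interface could use hover distance is to enlarge a small control as the finger descends toward it. This sketch is only a guess at the general idea; the thresholds and the linear ramp are our assumptions, not Microsoft’s algorithm:

```python
def target_scale(hover_mm, near=10.0, far=40.0, max_boost=1.5):
    """Scale factor for a small on-screen control, given the sensed
    height of the finger above the screen in millimeters."""
    if hover_mm >= far:
        return 1.0                       # finger far away: normal size
    if hover_mm <= near:
        return max_boost                 # finger close: fully enlarged
    t = (far - hover_mm) / (far - near)  # 0..1 as the finger descends
    return 1.0 + t * (max_boost - 1.0)
```

Because the boost ramps in smoothly rather than snapping, the interface can prepare for a tap without visibly jumping around.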

PaperID

What if paper could be just as interactive as a touchscreen? Researchers at the University of Washington, Disney Research, and Carnegie Mellon University have figured out how, by giving a sheet of paper the ability to sense its environment, respond to gesture commands, and connect to the Internet of Things.

It’s called PaperID, and it works via inexpensive printable antennas. The possibilities are intriguing: imagine a physical book linked to your Kindle ebook, so flipping a page in the real world also turns the page on your e-reader. Or imagine a sheet of notes that can detect the movement of a wand passing over it.
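A hand moving over a printed tag briefly attenuates its radio signal, so a gesture can show up as a dip in received signal strength. This toy detector is our own illustration of that principle, not the PaperID team’s classifier, and the 6 dB threshold is an invented number:

```python
def detect_swipe(rssi_trace, drop_db=6.0):
    """Flag a swipe over a printed tag: a passing hand attenuates the
    signal, showing up as one contiguous dip in the RSSI trace (dBm)."""
    baseline = max(rssi_trace)
    dipped = [i for i, r in enumerate(rssi_trace) if baseline - r >= drop_db]
    # One contiguous run of attenuated samples looks like a single swipe.
    return bool(dipped) and dipped[-1] - dipped[0] == len(dipped) - 1
```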

SkinTrack

A smartwatch screen isn’t big enough to allow many interactions: you have room to tap or swipe, but that’s about it. SkinTrack, a new technology developed by the Future Interfaces Group at Carnegie Mellon’s Human-Computer Interaction Institute, extends the “touchscreen” of your watch across your entire hand and arm with the help of a specially designed ring.

Imagine dialing a call on your Apple Watch by dragging a finger across the back of your hand, using it as a cursor on a keypad. SkinTrack could even make more demanding video games playable on a wearable by enabling a much larger library of gestures to control what’s happening on that postage-stamp-sized screen.
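Conceptually, the watchband senses the ring’s signal at multiple points, and the relative signal at each sensor hints at where the finger is on the skin. The real system works with phase differences of a high-frequency signal; the amplitude ratio in this 1-D toy is a simplification we made up to show the idea:

```python
def finger_position(amp_near, amp_far, band_len_cm=10.0):
    """Toy 1-D localization: estimated finger position between two
    sensing electrodes, from the relative signal amplitude at each."""
    total = amp_near + amp_far
    if total == 0:
        return None                       # no signal: finger not on skin
    return band_len_cm * amp_far / total  # 0.0 = right at the near electrode
```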

Materiable

Materiable is a physical interface of moving “pixels” built on inFORM, the shape display MIT’s Tangible Media Group developed in 2013. Materiable gives the existing inFORM the ability to mimic the tactile properties of real materials like rubber, water, sand, and more. Depending on the settings, pressing the surface can make its pixels ripple like water, squish like jelly, or even bounce back like a rubber ball. It all works by giving each inFORM pixel its own ability to sense pressure and then react with simulated physics. The result is like a big block of shape-shifting digital material that designers, medical students, and even artists could use in a variety of surprising ways.
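Each pin reacting with simulated physics can be sketched as a mass on a damped spring: stiff and lightly damped feels like rubber, soft and heavily damped feels more like sand. This class is an illustrative toy, not the Tangible Media Group’s code:

```python
class PhysicsPixel:
    """One pin of the shape display as a unit mass on a damped spring,
    so a press can rebound like rubber or settle slowly like sand."""
    def __init__(self, stiffness, damping, dt=0.01):
        self.k, self.c, self.dt = stiffness, damping, dt
        self.height, self.velocity = 0.0, 0.0

    def step(self, pressure):
        # Spring pulls the pin back to rest; damping bleeds off energy;
        # external pressure from a hand pushes the pin down.
        accel = -self.k * self.height - self.c * self.velocity - pressure
        self.velocity += accel * self.dt      # semi-implicit Euler step
        self.height += self.velocity * self.dt
        return self.height
```

Tuning `stiffness` and `damping` per pixel is all it takes to swap one simulated material for another on the same hardware.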

HoloFlex

Remember the flexible smartphone screen that Co.Design covered a few months ago? HoloFlex is the next generation of that screen: a flexible smartphone that displays holograms you can interact with. The effect is groundbreaking: two people looking at the same 3-D teacup on screen each see it from the correct perspective, no matter where they are standing relative to the display.

The HoloFlex pulls off this neat trick by projecting 12-pixel-wide circular “blocks” of the image through more than 16,000 fisheye lenses. The resolution is currently low (160 x 104, less than the original Apple II’s), but give this technology five years, and we could all be walking around with holographic iPhones. Read more about the HoloFlex here.
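The lens-array math boils down to choosing, for each lens, which of its 12 underlying pixels faces the viewer, since each pixel in a block exits the lens at a slightly different angle. In this sketch, the lens spread angle `fov_deg` and the uniform angular mapping are assumptions we invented for illustration:

```python
def subpixel_for_view(angle_deg, block_w=12, fov_deg=60.0):
    """Pick which of the 12 pixels under one fisheye lens is visible to
    a viewer at angle_deg from the screen normal."""
    half = fov_deg / 2
    a = max(-half, min(half, angle_deg))   # clamp to the lens's spread
    t = (a + half) / fov_deg               # 0..1 across the pixel block
    return min(block_w - 1, int(t * block_w))
```

Rendering 12 slightly different viewpoints into each block is what trades away resolution in exchange for the glasses-free 3-D effect.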

SparseLight

Augmented reality headsets such as Microsoft’s HoloLens let wearers see the physical and the digital at the same time, but the lenses they depend on have such a small field of view that the effect can quickly be ruined. The same goes for VR, where the scene you’re looking at can feel like it’s sitting at the end of a long tunnel.

Microsoft has thought about this problem, and SparseLight is its solution. The idea is to expand the apparent field of view of a head-mounted display by placing sparse grids of LEDs in the user’s peripheral vision. Because human peripheral vision is far blurrier than what we look at directly, these LEDs only need to approximate an object’s color and brightness to fool our brains into thinking we see the whole scene.
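Driving each peripheral LED can be as simple as averaging the color of the patch of the rendered frame that lies in the LED’s direction. A minimal sketch, with the frame as a 2-D list of RGB tuples and the region mapping as our own simplification:

```python
def led_color(frame, region):
    """Average color for one peripheral LED, taken from the patch of the
    rendered frame (a 2-D list of (r, g, b) tuples) in that direction."""
    (r0, c0), (r1, c1) = region            # top-left / bottom-right corners
    pixels = [frame[r][c] for r in range(r0, r1) for c in range(c0, c1)]
    # Blurry peripheral vision only needs the patch's average color.
    return tuple(sum(p[i] for p in pixels) // len(pixels) for i in range(3))
```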

SparseLight technology can be applied to both virtual reality and augmented reality headsets to increase immersion. Microsoft Research even estimates that it can reduce some of the motion sickness issues associated with wearing these headsets.

Posted on May 12, 2016