Cameras capture hand movement in front of a display, enabling interaction with onscreen content without any physical contact between hand and screen.
See the supported Gesture Recognition and Touch Emulation options.
Computer vision technology can identify age, gender, group size, views, impressions, and more in real time. Use Intuiface both to capture this information for analysis - such as an ad's audience metrics - and to trigger onscreen content uniquely personalized for predefined demographic categories.
Choose from a range of supported computer vision options.
Similar in concept to beacon technology, RFID/NFC systems uniquely identify tagged items. Intuiface can communicate with any RFID/NFC reader, such as those from Nexmosphere, using the captured identity to show individualized information or trigger any of 200+ possible actions.
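The core pattern - a scanned tag identity selecting what to show - can be sketched in a few lines. The tag UIDs and content names below are hypothetical examples, not real Intuiface identifiers:

```python
# Minimal sketch: map RFID/NFC tag UIDs to onscreen content.
# UIDs and content names are illustrative placeholders.

TAG_CONTENT = {
    "04:A3:1B:92": "sneaker-product-sheet",
    "04:7F:C2:10": "jacket-product-sheet",
}

def on_tag_read(uid: str) -> str:
    """Return the content to show for a scanned tag, or a fallback."""
    return TAG_CONTENT.get(uid.upper(), "attract-loop")
```

Unknown tags fall back to an attract loop rather than failing, which keeps the display usable when an untagged item is scanned.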
Proximity and motion detectors - like those from Nexmosphere - can alert Intuiface to the presence or approach of onlookers. Use them to trigger events that capture audience attention and draw them in for further engagement.
Intuiface supports use of the spoken word, capturing information or reacting to commands activated from Amazon Alexa or Google Home with any of 200+ possible on-screen or behind-the-scenes actions. Crucial for creating accessible experiences for those who can't effectively touch the display.
The Internet of Things is the universe of network-accessible devices that can send and receive information, everything from room lights and thermostats to your refrigerator. Intuiface can communicate with and direct all of these devices in real time.
Intuiface can take any URL as input and display the corresponding QR code. Enable Mobile Activation by creating take-home information on the fly without knowing in advance which information will be requested. Perhaps use this in combination with Intuiface's beacon support!
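Building the take-home URL on the fly amounts to appending the visitor's selection as query parameters before the URL is rendered as a QR code. A minimal sketch using only the Python standard library (the base URL and parameter names are assumptions; the actual QR rendering would be handled by Intuiface or any QR library):

```python
from urllib.parse import urlencode, urlsplit, urlunsplit

def take_home_url(base: str, **params: str) -> str:
    """Append query parameters (e.g., the item a visitor selected) to a base URL."""
    scheme, netloc, path, query, frag = urlsplit(base)
    merged = query + "&" + urlencode(params) if query else urlencode(params)
    return urlunsplit((scheme, netloc, path, merged, frag))

# take_home_url("https://example.com/info", item="sneaker-42")
# -> "https://example.com/info?item=sneaker-42"
```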
With its support for web triggers, Intuiface can receive navigation and selection commands from web content running on mobile devices. Alternatively, third-party software can send mouse commands from a personal device to a display, simulating tap and drag gestures.
"To both thrill and satisfy an audience, you have to give them what they need, not just what they dream. Touch is the necessary minimum but there is so much more."
- Chloe Canella, Head of Digital Marketing - Intuiface
Tangible objects are items whose presence and orientation can be detected by a display. Intuiface can react to these objects - aka tangible object recognition - on any display/middleware combination supporting the TUIO protocol, treating them as unique identifiers as well as points of interest for displaying interactive content.
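TUIO delivers add/update/remove events for each tracked object, carrying a session id plus the object's class (marker) id, position, and rotation. A sketch of a handler built on that event shape - the content lookup is a hypothetical example, not an Intuiface API:

```python
# Sketch of reacting to TUIO-style tangible-object events.
# Fields mirror the TUIO 2D-object profile (session id, class/marker id,
# x, y, angle); MARKER_CONTENT is an illustrative placeholder.

active = {}  # session_id -> (marker_id, x, y, angle)

MARKER_CONTENT = {7: "wine-detail-panel", 12: "cheese-pairing-panel"}

def on_object_add(session_id, marker_id, x, y, angle):
    """A tagged object was placed on the display: show its content."""
    active[session_id] = (marker_id, x, y, angle)
    return MARKER_CONTENT.get(marker_id, "unknown-object")

def on_object_update(session_id, x, y, angle):
    """The object moved or rotated: keep its pose for content placement."""
    marker_id = active[session_id][0]
    active[session_id] = (marker_id, x, y, angle)

def on_object_remove(session_id):
    """The object was lifted off the display."""
    active.pop(session_id, None)
```

Tracking pose per session id is what lets the content follow the physical object as it slides and rotates.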
With beacon support, your Intuiface experiences can uniquely identify select items, be notified when approaching an item, or broadcast URLs in response to user choice. Create lift-and-learn scenarios for your store or personalized browsing experiences for your museum.
Haptic displays both receive and transmit touch input. The ability to receive input is common to all touch displays; it's the transmit part that makes haptic unique. Haptic displays transmit a sense of texture, of friction, enabling users to "feel" the item depicted onscreen. Thus the name "haptic" - the sense of touch.
With Intuiface you can map key presses to any of over 200 supported actions including turn page, play video, and maximize image. Combined with other interactive options, touch experiences become accessible for the physically and visually impaired.
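Conceptually this is a key-to-action dispatch table. A minimal sketch, with illustrative action names rather than Intuiface's actual action identifiers:

```python
# Sketch: dispatch key presses to named actions.
# Keys and action names are illustrative placeholders.

KEY_ACTIONS = {
    "ArrowRight": "turn page forward",
    "ArrowLeft": "turn page back",
    "Enter": "play video",
    "m": "maximize image",
}

def on_key_press(key: str):
    """Return the action bound to a key, or None if the key is unmapped."""
    return KEY_ACTIONS.get(key)
```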
Intuiface supports on-the-fly conversion of both static and dynamic text into speech for the visually impaired. Designers can configure both the voice and speed, using any language available on the target device to ensure proper clarity and intonation.
Intuiface experiences can communicate with one another across any network. Triggers in one experience launch any of 200+ actions in another. And a remote action API enables third-party apps to control, or be controlled by, Intuiface experiences from afar.
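At its simplest, remote triggering means serializing an action and its parameters and sending them to a peer over the network. The sketch below uses JSON over HTTP purely for illustration; the endpoint path and payload shape are assumptions, not Intuiface's actual remote action API:

```python
import json
from urllib import request

def build_remote_action(action: str, **params):
    """Serialize a remote action as JSON (payload shape is hypothetical)."""
    return json.dumps({"action": action, "parameters": params})

def send_remote_action(host: str, port: int, action: str, **params):
    """POST the action to a peer experience; the URL path is a placeholder."""
    body = build_remote_action(action, **params).encode("utf-8")
    req = request.Request(
        f"http://{host}:{port}/remote-action",
        data=body,
        headers={"Content-Type": "application/json"},
    )
    with request.urlopen(req) as resp:  # network call; requires a listening peer
        return resp.status
```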