“With Razer, we found a partner very eager to innovate and push the boundaries; it was a very fast process. We were able to integrate the core technology from Basslet into the Razer headphones. But going B2B, hardware is only one part of the solution; applications are what developers need, so we moved further into software. If you don’t drive the haptics in the proper way, you don’t get the adequate haptic experience,” he continued.
According to Büttner, whose background is in audio and music, designing haptic experiences into a game should be as simple for any developer as creating sounds. For this purpose, Lofelt has developed signal-processing algorithms that extract the necessary information from the audio to drive the haptic signal in real time.
“For content creators, it is very lean, practical and easy to implement. They already have the sounds in their libraries; they only need to pass the audio files through our DSP,” the CEO explained.
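Lofelt's actual DSP is proprietary, but the general idea it describes, deriving a haptic drive signal from existing audio, can be sketched in a few lines. The following is a minimal, hypothetical illustration (not Lofelt's algorithm): it rectifies the audio and smooths it with a one-pole low-pass filter, producing a slow amplitude envelope of the kind that could drive a vibration actuator. The function name, cutoff frequency, and filter choice are all assumptions for illustration.

```python
import math

def audio_to_haptics(samples, sample_rate, cutoff_hz=80.0):
    """Hypothetical audio-to-haptics sketch: rectify the signal and
    smooth it with a one-pole low-pass filter, yielding an amplitude
    envelope that could drive an actuator. Not Lofelt's actual DSP."""
    # One-pole smoothing coefficient derived from the cutoff frequency.
    alpha = 1.0 - math.exp(-2.0 * math.pi * cutoff_hz / sample_rate)
    envelope = 0.0
    out = []
    for s in samples:
        # Follow the rectified (absolute) signal with exponential smoothing.
        envelope += alpha * (abs(s) - envelope)
        out.append(envelope)
    return out

# Example: a 0.1-second, 50 Hz "rumble" burst at 48 kHz.
sr = 48_000
burst = [math.sin(2 * math.pi * 50 * t / sr) for t in range(sr // 10)]
drive = audio_to_haptics(burst, sr)
```

Because the filter is a single recursive pass over the samples, it can run block-by-block on a live audio stream, which is consistent with the real-time, low-latency operation described below.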
On latency, Büttner said that converting audio to haptics takes less than 5 ms.
“The Razer Nari Ultimate headset has one haptic engine in each ear cup, in effect providing haptic stereo, so when you hear an explosion or a gunshot in a game, you also feel localized haptics. Studies have shown that gamers’ reaction times are significantly faster with haptics, because your physical reaction time is faster than your auditory one.”
As Razer demonstrated at CES with its HyperSense prototypes, the next generation of PC peripherals could form an entire ecosystem in which the chair, the mouse, the wrist rest, the game controller and the headphones together provide synchronized, full-surround haptic feedback.