VividQ, a manufacturer of holograms for augmented reality games, has partnered with waveguide designer Dispelix to create new 3D holographic imaging technology.
The companies said the technology was considered nearly impossible just two years ago. They say they have designed and produced a “waveguide combiner” that can accurately render 3D content at varying depths simultaneously in the user’s environment. For the first time, users will be able to enjoy an immersive AR gaming experience in which digital content is placed in their real world and they can interact with it naturally and comfortably. The technology is intended for wearable devices such as AR headsets and smart glasses.
The two companies also announced a commercial partnership to develop the new 3D waveguide technology toward mass-production readiness. This will give headset manufacturers the ability to start on their AR product roadmaps now.
The first augmented reality experiences seen to date through headsets such as Magic Leap, Microsoft HoloLens, Vuzix, and others produce 2D stereoscopic images at a fixed focal length, or one focal plane at a time. This often leads to eye strain and nausea for the user, and doesn’t provide the immersive three-dimensional experience needed: for example, users cannot interact with objects that should be naturally within reach, because those objects are not positioned properly in the real world.
To deliver the kinds of immersive experiences needed for AR to reach the mass market, consumers need a sufficient field of view and the ability to focus on 3D images at any natural distance, anywhere from 10cm to optical infinity, at the same time, the same way they would naturally focus on physical objects.
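To put that range in optical terms: focus distances are commonly expressed in diopters, the reciprocal of the distance in meters, so 10cm is 10 D and optical infinity is 0 D. Here is a minimal sketch of the arithmetic, using the roughly two-meter fixed focal plane mentioned later in this article as the conventional-headset baseline:

```python
def diopters(distance_m: float) -> float:
    """Convert a viewing distance in meters to diopters (1/m)."""
    return 1.0 / distance_m

# Natural focus range described above: 10 cm out to optical infinity.
near_d = diopters(0.10)   # 10.0 D
far_d = 0.0               # optical infinity

# Fixed focal plane of a conventional 2D waveguide (~2 m, per the article).
fixed_plane_d = diopters(2.0)   # 0.5 D

# Vergence-accommodation mismatch for a virtual object held at 25 cm:
# the eyes converge as if the object were at 4.0 D, but must keep
# focusing on the fixed 0.5 D plane to see it sharply.
mismatch = diopters(0.25) - fixed_plane_d
print(f"vergence-accommodation mismatch: {mismatch:.1f} D")  # 3.5 D
```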
The waveguide combiner is the industry’s preferred method for rendering AR images in a compact form. This next-generation waveguide and its accompanying software are optimized for 3D applications such as games, which means consumer brands around the world can tap the full potential of the market.
A waveguide (also known as a ‘combiner’ or ‘waveguide combiner’) provides the lightweight, conventional-looking front element (i.e. it looks like a regular glass lens) that AR headsets need for widespread adoption. In addition to the form-factor advantages, waveguides on the market today perform a process called pupil replication. This means they can take the image from a small display panel and effectively enlarge the region over which it can be viewed (the ‘eye box’) by creating a grid of copies of the small image in front of the viewer’s eyes, a bit like a periscope, except that instead of one view it creates multiple views. This is essential to making AR wearables practical and easy to use.
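As a rough illustration of pupil replication (a simplified model, not VividQ’s or Dispelix’s actual design), the output can be treated as a grid of copies of one small exit pupil laid out at a fixed pitch, so the effective eye box grows with the number of copies. All of the numbers below are invented for illustration:

```python
import numpy as np

# Invented example numbers: one small native exit pupil, copied on a grid.
native_pupil_mm = 2.0            # width of a single copy
pitch_mm = 2.0                   # center-to-center spacing of copies
replicas_x, replicas_y = 6, 4    # grid of copies in front of the eye

# Centers of every replica.
xs = np.arange(replicas_x) * pitch_mm
ys = np.arange(replicas_y) * pitch_mm

# Effective eye box spanned by the whole grid of copies.
eyebox_w = xs.max() + native_pupil_mm   # 12.0 mm wide
eyebox_h = ys.max() + native_pupil_mm   #  8.0 mm tall
print(f"{replicas_x * replicas_y} copies -> "
      f"{eyebox_w:.0f} x {eyebox_h:.0f} mm effective eye box")
```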
Small eye boxes are notoriously difficult to align with the user’s pupils, and the eye can easily “lose” the image if it is not aligned correctly. The headset must be properly fitted to the user, because even the variation in interpupillary distance (IPD) across users can mean that a pupil falls outside the eye box, leaving the user unable to see the virtual image.
Since there is a fundamental trade-off between the size of the eye box (also called the ‘exit pupil’) and the field of view (FoV) of the display, this replication allows the optical designer to make the native eye box very small, maximizing the FoV, and leaves it to the copying process to present a large effective eye box to the viewer.
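That trade-off can be sketched with a textbook-style one-dimensional invariant (a simplification for illustration, not a description of either company’s optics): the product of exit-pupil width and the sine of the half field of view stays roughly fixed, so widening the FoV shrinks the native eye box, and replication is what restores a usable effective eye box. Under that assumption, with invented baseline numbers:

```python
import math

# Simplified 1D invariant: eyebox_width * sin(FoV / 2) ~ constant.
# Baseline numbers are assumed purely to anchor the illustration.
baseline_eyebox_mm = 10.0
baseline_fov_deg = 20.0
invariant = baseline_eyebox_mm * math.sin(math.radians(baseline_fov_deg / 2))

for fov_deg in (20, 40, 60):
    eyebox_mm = invariant / math.sin(math.radians(fov_deg / 2))
    print(f"FoV {fov_deg:2d} deg -> native eye box ~{eyebox_mm:4.1f} mm")
# The native eye box shrinks from 10.0 mm to ~3.5 mm as the FoV
# triples; pupil replication then tiles copies of that small pupil
# to give the viewer a large effective eye box anyway.
```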
VividQ CEO Darran Milne said: “There has been significant research and investment in technology that can create the kinds of AR experiences we dream of, but they don’t work because they cannot meet even basic user expectations. In an industry that has seen its share of hype, it can be easy to dismiss any new invention as more of the same, but a fundamental problem has always been the complexity of displaying 3D images placed in the real world with a good field of view and with an eye box large enough to accommodate a variety of IPDs (the user’s interpupillary distance, or the distance between the pupils), all encapsulated in a lightweight lens.”
“We solved that problem, designed something that was manufacturable, tested and proved it, and established the manufacturing partnerships necessary to mass produce it,” Milne added. “It’s a breakthrough because without 3D holograms you can’t deliver AR. Simply put, while others are developing 2D displays to wear on your face, we have developed the window through which you will experience the real world and the digital world in one place.”
Conceptual image for a simulation game in which users can interact with the digital world at their fingertips.
VividQ’s patent-pending 3D waveguide combiner is designed to work with the company’s software, and both can be licensed by wearable manufacturers to build their product roadmaps. VividQ’s hologram rendering software works with standard game engines such as Unity and Unreal Engine, making it easy for game developers to create new experiences. The 3D waveguides can be manufactured and supplied at scale through VividQ’s manufacturing partner Dispelix, a transparent-waveguide manufacturer for wearables based in Espoo, Finland.
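The article doesn’t detail the integration API, but conceptually a hologram renderer sitting behind a game engine consumes the color and depth buffers the engine already produces each frame, since depth is what lets content be placed at real 3D distances. The sketch below is purely hypothetical; every name in it is invented for illustration and is not VividQ’s actual SDK:

```python
import numpy as np

def compute_hologram(color: np.ndarray, depth_m: np.ndarray) -> np.ndarray:
    """Hypothetical stand-in for a hologram renderer. It accepts the game
    engine's per-frame color and depth buffers and returns a phase pattern
    for the display panel; the real computation is proprietary, so this
    placeholder only shows the shape of the data flow."""
    assert color.shape[:2] == depth_m.shape
    return np.zeros_like(depth_m)  # placeholder phase pattern

# Per frame, an engine plugin would hand off what it already renders:
color_buffer = np.zeros((1080, 1920, 3), dtype=np.uint8)     # engine color
depth_buffer = np.full((1080, 1920), 2.0, dtype=np.float32)  # meters
panel_phase = compute_hologram(color_buffer, depth_buffer)
```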
“Wearable AR devices have huge potential around the world. For applications such as gaming and professional use, where users need to be immersed for long periods of time, it is important that the content is in true 3D and placed in the user’s environment,” said Antti Sunnari, CEO of Dispelix, in a statement. “This also addresses the problems of nausea and fatigue. We are excited to partner with VividQ as the design and manufacturing partner for this groundbreaking 3D waveguide.”
At its headquarters in Cambridge, UK, VividQ has demonstrated the 3D waveguide technology and software to device manufacturers and consumer technology brands, with whom it is working closely to deliver next-generation AR wearables. The breakthrough in AR optics means that 3D holograms for gaming are now a reality.
The feat the companies achieved was described as “almost impossible” in a research paper published in the journal Nanophotonics in 2021.
Current waveguide combiners assume that the incident light rays are parallel (hence the 2D image), because they require the light bouncing around inside the structure to all follow paths of the same length. If you inject diverging rays (a 3D image), the light paths all differ, depending on the position of the point in the input 3D image from which each ray originates.
This was a big deal, because it meant that the extracted light traveled different distances, and the effect, as shown in the image above, is that the viewer sees multiple partially overlapping copies of the input image at seemingly random distances. That makes the output essentially useless for any application. By contrast, the new 3D waveguide combiner can accommodate diverging rays and display the image (the couch, in the example above) accurately.
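A simplified geometric model makes the problem concrete (an illustration, not the paper’s analysis): each pupil replica is laterally offset, so a point source at a finite distance is seen at a slightly different angle through each copy, which is exactly the overlapping-copies artifact described above; a collimated source at infinity shows no such spread. All values below are assumed:

```python
import math

# Assumed lateral offsets of a few pupil replicas, in millimeters.
replica_offsets_mm = [0.0, 5.0, 10.0, 15.0]

def apparent_angle_deg(offset_mm: float, source_dist_m: float) -> float:
    """Angle at which an offset replica sees a point source."""
    return math.degrees(math.atan2(offset_mm / 1000.0, source_dist_m))

for dist_m in (0.3, 2.0, math.inf):   # near 3D point, mid-range, collimated
    angles = [apparent_angle_deg(o, dist_m) for o in replica_offsets_mm]
    spread = max(angles) - min(angles)
    print(f"source at {dist_m} m -> {spread:.2f} deg spread across copies")
# Finite distances give a nonzero spread (partially overlapping ghost
# copies); only the collimated case at infinity is identical in every copy.
```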
VividQ’s 3D waveguide consists of two elements: first, a modification of the standard pupil-replicating waveguide design described above; and second, a hologram calculation algorithm that corrects for the distortion the waveguide introduces. The hardware and software components are tuned to work in harmony with each other, which is why the VividQ waveguide cannot be used with anyone else’s software or system.
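Neither the article nor the companies disclose the correction algorithm, but the general idea of software pre-correction in computer-generated holography is to bake the inverse of a modeled distortion into the hologram before display, so the optics’ own aberration cancels it in transit. A toy phase-only sketch of that principle, with an invented quadratic aberration standing in for the real distortion model:

```python
import numpy as np

# Toy model: the waveguide adds a known aberration phase to the field.
H, W = 512, 512
yy, xx = np.mgrid[0:H, 0:W] / H
waveguide_aberration = np.pi * (xx**2 + yy**2)   # assumed, quadratic

# Placeholder for the hologram phase that would render the target scene
# through perfect optics (a real CGH computation would go here).
ideal_phase = np.random.uniform(0, 2 * np.pi, (H, W))

# Pre-correction: subtract the aberration in software, so the hardware
# re-adds it in transit and the viewer receives the ideal phase.
displayed_phase = (ideal_phase - waveguide_aberration) % (2 * np.pi)
seen_phase = (displayed_phase + waveguide_aberration) % (2 * np.pi)
assert np.allclose(np.exp(1j * seen_phase), np.exp(1j * ideal_phase))
```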
VividQ employs more than 50 people in Cambridge, London, Tokyo, and Taipei. The companies started working together at the end of 2021. VividQ was founded in 2017, with roots in the University of Cambridge’s photonics department and the Cambridge Judge Business School.
To date, the company has raised $23 million from deep tech funds in the UK, Austria, Germany, Japan, and Silicon Valley. When asked what the inspiration was, VividQ CTO Tom Durant said in an email to GamesBeat: “Understand what the limitations are and then figure out how to work around them. Once we’ve identified that path, our multidisciplinary team of researchers and engineers in optics and software tackles each one in turn. Instead of treating this as just an optical problem, our solution is based on hardware and software that are tuned to work in tandem.”
As for how this technology differs from competing technologies, the company says that the waveguide combiners currently on the market can only display images in two dimensions at a set focal distance. This is usually about two meters in front of you.
“You can’t bring them closer to focus on, or focus through them to other digital objects that are farther away,” the company said. “And when you look at these digital objects floating in front of you, you can very quickly experience eye strain and VAC (vergence-accommodation conflict), causing a feeling of nausea. For gaming, this is very limiting. You want to create an experience where the user can hold an item in their hand and do something with it without the need for a controller. You also want to place lots of digital items at persistent locations in the real world, with the freedom to focus on them and on real objects as close as you want, which leads to a strong sense of immersion.”