Intel has demoed a wireless headset that can turn users' living rooms into a 'merged reality' game.
The headset replaces pre-scanned furniture and other objects with video game scenery in virtual reality.
It includes technology that tracks the user's hands and position without any external sensors.
The technology, demoed at this week's CES 2017 in Las Vegas, will be licensed to manufacturers toward the end of 2017, according to Intel chief executive Brian Krzanich.
Presenting the latest prototype at the CES event yesterday, Mr Krzanich explained that the headset does not need to connect to a separate PC or power source.
It has a battery and computer built directly into the device.
Two players wearing the headset then demonstrated on stage what the technology could do.
The headset created a virtual replica of the room using scanned furniture, which Intel calls 'merged reality'.
During Intel's demo, bookcases and coffee tables were digitally replaced by video game scenery of a similar size and shape.
The replacement scenery was designed to fit the game being demoed, with the headsets virtually redecorating the room as a futuristic space shuttle.
The headset itself will not be manufactured by Intel, but the company will instead offer the technology to other tech companies to build their own products from.
Project Alloy was first unveiled in August last year, but the newest prototype gave the technology its most advanced showing yet.
Back when it was first announced, Intel projected that Project Alloy would bridge the gap between virtual reality headsets, such as the Oculus Rift, and augmented reality headsets, such as the Microsoft HoloLens, which projects holographic imagery onto the wearer's field of vision.
During his opening keynote at the 2016 Intel Developer Forum in San Francisco, Mr Krzanich unveiled Project Alloy, describing it as an 'all-in-one virtual reality solution' built from the ground up.
'Merged reality delivers virtual world experiences more dynamically and naturally than ever before – and makes experiences impossible in the real world now possible,' he said.
The firm announced it was also working with Microsoft to use the same 'holographic engine' its HoloLens uses, which will be part of Windows 10 next year.
'Next year, we will be releasing an update to Windows 10, which will enable mainstream PCs to run the Windows Holographic shell and associated mixed reality and universal Windows applications,' Microsoft's Terry Myerson said.
'The Windows Holographic shell enables an entirely new experience for multi-tasking in mixed reality, blending 2D and 3D apps at the same time, while supporting a broad range of 6 degrees of freedom devices.'
'With most VR and AR technologies available today, it’s challenging to merge physical, real-life movement and environments with simulated virtual objects, environments and actions,' Mr Krzanich wrote in a post explaining the new system.
'When it comes down to it, today’s virtual reality isn’t really that virtual.
'You often need a complex set of consoles, multiple sensors and cameras, and hand controllers.
'Through merged reality, see your hands, see your friends … see the wall you are about to run into,' the firm said.
'Using Intel RealSense technology, not only can you see these elements from the real world, but you can use your hands to interact with elements of your virtual world, merging realities.'
That means users can 'cut the VR cord', allowing a full range of motion with six degrees of freedom across a large space.
'This, combined with collision detection and avoidance, enables the user to utilize physical movement to explore a virtual space,' Intel says.