How Digital Twins and Machine Teleportation are transforming smart manufacturing: A Q&A with Eclipse Automation and Interaptix

Tracey Thomas, Content Communications Specialist

The phrase “machine teleportation” sounds like it was pulled from a sci-fi film, and if you’ve ever seen it in action, that’s exactly how it can feel: the future of 3D visualization made real. For the uninitiated, machine teleportation is one application of a broader set of technologies built on radiance fields, which are making the process of 3D data capture radically faster, easier, and more flexible.

To better understand this rapidly maturing technology, and the broad range of possibilities it offers for automated manufacturing, we sat down with two experts in the field: Brad Zalishcuk, who leads the Digital Manufacturing Transformation team at Eclipse Automation, and Bardia Bina, CEO of Interaptix, a leading player in spatial computing and digital twins for more than a decade.

In This Article

  • Radiance field technology and Gaussian splatting create detailed 3D representations of factory environments and machinery from everyday mobile-device photos and videos, with no specialized equipment required.
  • Unlike traditional 3D scanning that produces “blobby meshes,” this technology reconstructs reality as cameras see it, making it ideal for complex factory environments with pipes, wires, and machinery.
  • Practical applications include immersive sales experiences, collaborative project reviews, remote support, and “machine teleportation” for virtual fitment checks.
  • This technology enables the shift from 2D to 3D interfaces, offering fundamental efficiency gains for manufacturing’s inherently 3D operations.

Can you summarize what radiance field technology is and why it matters for the manufacturing sector?

Bardia: Radiance field techniques are ways of generating highly detailed, accurate 3D representations of physical objects and environments from photos and videos recorded on everyday mobile devices. What’s really unique and powerful here is that anybody can do this with the equipment in their pocket. In contrast to traditional 3D modeling techniques, you’re not generating 3D geometry—you’re essentially recreating what you would see from every point in 3D space. Gaussian splatting is based on radiance fields, and what’s exciting about it is that rendering is far more efficient, so it lends itself to real-time applications and works with conventional graphics pipelines.
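To make this concrete, here is a minimal, illustrative Python sketch (not any vendor’s implementation) of the idea Bardia describes: a radiance field is a function that, given a 3D position and a viewing direction, returns a color and a density, and a pixel is rendered by integrating those values along a camera ray. The radiance_field placeholder below stands in for a model that would actually be fitted to your photos.

import numpy as np

def radiance_field(position, view_dir):
    # Placeholder for a fitted model (e.g. a neural network or a set of Gaussians).
    # Given a 3D point and a viewing direction, it returns an RGB color and a
    # volume density (how opaque space is at that point).
    rgb = np.clip(0.5 + 0.5 * np.sin(position), 0.0, 1.0)  # toy color
    density = float(np.exp(-np.linalg.norm(position)))     # toy density
    return rgb, density

def render_ray(origin, direction, near=0.1, far=4.0, n_samples=64):
    # Classic volume rendering: sample points along the ray and
    # alpha-composite their colors front to back.
    ts = np.linspace(near, far, n_samples)
    delta = ts[1] - ts[0]
    color, transmittance = np.zeros(3), 1.0
    for t in ts:
        rgb, density = radiance_field(origin + t * direction, direction)
        alpha = 1.0 - np.exp(-density * delta)  # opacity of this ray segment
        color += transmittance * alpha * rgb    # add this segment's contribution
        transmittance *= 1.0 - alpha            # light left after the segment
    return color

pixel = render_ray(np.zeros(3), np.array([0.0, 0.0, 1.0]))

Repeating that integration for every pixel of a virtual camera is what lets the technique recreate what you would see from any point in 3D space.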

Why is this important for the manufacturing sector in particular?

Brad: Because you can capture the necessary imagery from practically any camera, any lens, any device. In our facilities, we have eyes everywhere: security cameras, cameras on AMRs driving around, cameras on the phones in our pockets, all capturing data on a day-to-day basis. This enables you to digitize your facility in real time or near real time, so you can quickly visualize changes over time and leverage that data to reconfigure things. But the big value here is where we’re going in this era of physical AI. If you think about world models—Google just released Genie, which can generate real-time 3D worlds with interaction capabilities—this is huge for robotics simulation. Through camera feeds, if you continually digitize your facility, you can fine-tune world models on your data to create infinite amounts of synthetic data.

But 3D scanning has been around for years. Why is this different?

Brad: I’ve been exploring 3D scanning and photogrammetry for a while, and there are a few technologies for doing it, each with its own limitations. Laser scanning is very accurate for large spaces, but it falls apart with complicated machinery, pipes, wires, messy environments, and so on. You need laser bounce-back to capture data, and it sometimes requires markers and training. The result is often a blobby mesh that’s not representative of reality. With radiance fields, it’s a 3D image, so what you see is what you get. As long as your capture techniques are good, it reconstructs everything at accurate scale. It’s reconstructing reality from the lens of a camera, which is exactly how we see the world.

What is “Gaussian splatting” exactly?

Bardia: With a conventional 3D model, you have solid geometry, meaning mesh and textures. With Gaussian splatting, you don’t have that solid geometry. Instead, you have points in space with properties like position, orientation, size, and opacity. They’re called “splats” and follow a Gaussian distribution in shape. As you look through them with transparency, you see the right representation from that viewpoint. It’s an iterative process starting with a sparse point cloud and going through many iterations to make all viewpoints match the original dataset of images.
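As a rough illustration of the data structure Bardia describes (the fields below are representative, not a specific file format), each splat is a small semi-transparent Gaussian blob, and a viewpoint is rendered by sorting the splats by depth and blending them in order. The following Python sketch is deliberately simplified.

from dataclasses import dataclass
import numpy as np

@dataclass
class GaussianSplat:
    position: np.ndarray  # 3D center of the splat
    rotation: np.ndarray  # orientation (e.g. a quaternion)
    scale: np.ndarray     # size along each axis, giving an anisotropic blob
    opacity: float        # how strongly the splat occludes what is behind it
    color: np.ndarray     # RGB as seen from the current viewpoint

def composite(splats, camera_position):
    # Greatly simplified rendering: sort splats back to front by distance from
    # the camera and blend each one over the result using its opacity. Real
    # renderers project each Gaussian to screen space and blend per pixel,
    # but the sort-and-blend idea is the same.
    ordered = sorted(splats, key=lambda s: -np.linalg.norm(s.position - camera_position))
    pixel = np.zeros(3)
    for s in ordered:
        pixel = s.opacity * s.color + (1.0 - s.opacity) * pixel
    return pixel

Training then adjusts these per-splat properties over many iterations, as Bardia notes, until renders from every captured viewpoint match the original photos.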

What’s the process of actually creating a model like this?

Brad: Very simply, you pull out your phone, change the settings to reduce motion blur, and start shooting video. If it’s an object or machine you’re capturing, you orbit around it at multiple levels in portrait mode. You want to capture all directions—think of it as if you were going to navigate to that position in 3D space. The key is parallax, so you want to move the camera in space, not just pivot in one spot.

Bardia: Creating a good model is a three-step process: capture photos or videos, upload your data to be automatically processed, and within a couple of hours you have results you can share globally. There’s nothing manual beyond uploading—it all gets processed in the cloud.

How would people view or interact with this 3D model?

Brad: Multiple ways. There are web viewers like SuperSplat, you can use Omniverse via cloud or local applications, and existing tools like Unreal Engine and even Adobe Premiere have Gaussian splatting plugins. But the biggest platform is XR devices, especially the Apple Vision Pro, because of its rendering capability. It’s one-to-one scale and feels no different from being there.

Bardia: Most people don’t know how to interact with 3D models, though. Even computer-savvy individuals come from 2D interfaces. So when we make software applications like doppl that leverage this technology, we try to make interactions as intuitive as possible, so it’s like interacting with the physical world. You put on the headset, see yourself in that environment, and interact very similarly to the real world.

What are the practical business applications?

Brad: The lowest-hanging fruit here is definitely in the sales experience. At Eclipse, we’re creating Gaussian splats of all our machines so every sales engineer can show clients our catalog in collaborative one-to-one XR. You’re seeing this in other categories: Zillow is using Gaussian splatting to capture homes and properties, Nike is doing full 3D viewers of actual shoes on websites.

Second is collaboration throughout project deliveries, where we can pipe CAD changes directly to headsets for stakeholder feedback.

Third is remote support. Instead of 2D video feeds, I can come to your facility while you’re there on a headset, so I can see the same reality you are and provide guidance.

This technology helps us deliver better buying experiences where customers co-create with us. But to take it one step further, customers can also use this technology to showcase how they manufacture products to their own customers. Imagine showing a customer how Nike actually produces shoes, or how Kellogg’s makes its cereal. It’s about digitizing factory operations first, then involving end customers in co-creating their products.

Is this what you mean by “machine teleportation”?

Brad: Exactly. It’s a way of effectively “teleporting” a machine or a location anywhere in the world, instantly. The only thing you can’t do is physically touch it. It’s a very low-cost, democratized way to put a machine into a new location for fitment checks or to bring it back for retrofits. It’s where reality and digital meet.

What are the biggest limitations right now?

Brad: Because radiance field technology is advancing so rapidly, it can be hard to settle on a tech stack—something better is coming out practically every day. Also, lower-quality captures are hard to reconstruct from, so you need proper capture techniques, though that’s improving with advances like 360 cameras.

Bardia: Yes, quality is heavily dependent on input images. Even though the process is simple, you need to understand how to take the right types of images. It’s like riding a bike: everyone can do it, but to do it well you need to understand the theory, and then practice a bit. Also, Gaussian splats aren’t always compatible with traditional CAD or 3D printing software, though that’s evolving quickly.

What’s your strongest argument to a manufacturer who’s unsure about implementing the technology?

Bardia: The factory automation sector is at a pivotal point in how we interact with computers. After five decades of 2D interfaces, we’re shifting to 3D interfaces within the decade. Manufacturing is inherently 3D: designing 3D objects on 3D machinery in 3D facilities. The efficiency gains from interacting in 3D rather than 2D are fundamental.

Brad: In the near-term, this technology has the potential to transform the way you market and demonstrate physical products for very little cost. Instead of site visits for RFQs, manufacturers could provide 3D splats of their facilities for vendors to visit virtually and build concepts within. Long-term, it’s pure strategy for physical AI: having a thousand eyes in your facility generating queryable updates to understand day-to-day changes and build mental models for future AI simulation.

Interested in learning more about how radiance field technology can transform the way your customers and collaborators interact with your factory? Eclipse brings years of experience with 3D data capture and cutting-edge tech capability to every project. To learn more about how we can put this expertise to work for you, book a discovery call.


Get in touch to explore how our Advanced Engineering Services can help you move faster without compromising results.