
The Uses and Impacts of Marker-based Tracking


Allan Rankin, Target3D

In discussion with Allan Rankin, Managing Director of Target3D



Tell me why marker-based tracking is so important/prevalent, and what are its uses and impacts?


Optical marker-based tracking systems are still the go-to, highly versatile choice because they provide the most accurate 3D positional and rotational information, which can then be utilised in many different use cases and sectors.


“You would have recognised its use in animation and VFX for tracking the human form for mainstream entertainment - as seen in films like Planet of the Apes and Lord of the Rings.” 

Because of its supreme accuracy, marker-based tracking is used to understand where a person or object has been in time and space. This information can be delivered in real time at very high frame rates, or in post-production, where an operator can pre-record the data and play it back.


It means people can use the system both to capture what a subject is doing and to validate the subject's actions.


Let me explain. In robotics, for example, if you wanted to program a robotic arm to move 40cm, you can use an optical tracking system to confirm that it has moved 40cm - or indeed to tell you that it has actually moved 40.1cm. That distinction is really important to engineers and scientists. It enables them to verify that their programming is producing the intended motion.
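As a minimal sketch of that validation step (all names and numbers here are illustrative assumptions, not any particular tracker's API), you might compare the commanded displacement with the displacement measured between two tracked positions:

```python
import math

# Hypothetical check: validate a commanded 40 cm move against positions
# reported by an optical tracking system (all units in metres).
def validate_move(start, end, commanded_dist, tol):
    """Compare the tracked straight-line displacement with the commanded one."""
    measured = math.dist(start, end)      # distance between tracked positions
    error = measured - commanded_dist
    return measured, error, abs(error) <= tol

# Tracker reports the arm's marker centroid before and after the move.
measured, error, ok = validate_move(
    start=(0.000, 0.000, 0.000),
    end=(0.401, 0.000, 0.000),   # the arm actually moved 40.1 cm
    commanded_dist=0.400,        # we asked for 40 cm
    tol=0.002,                   # accept up to 2 mm of error
)
```

The same comparison, run per frame, is how engineers spot systematic offsets between what a controller reports and what the arm physically does.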


One great example is understanding how robotic arms move in spaces. 


For instance, an optical system could be used to work out the most efficient positioning of robotic arms when configuring a factory assembly line.


We have clients providing position validation of robots that mimic satellites docking in space. They use optical tracking to understand, with sub-millimetre accuracy in position and rotation, whether their robots will perform in space as they should.


 

This technology can be used in a myriad of applications and places. One great example is measuring the deflection of objects within a wind tunnel. This could be the wing of an aircraft, a panel of a Formula One car, or a cyclist trying to reduce aerodynamic drag in training - or it could help a manufacturer build new, efficient bicycles.


From understanding how a robotic arm moves to how a medical practitioner executes a certain task (such as tracking laparoscopic surgical equipment in an operating theatre) - Target3D has provided solutions for all.


Many other sectors use optical tracking to position someone in virtual environments, like head-mounted displays for virtual reality.


But before headsets became readily available, there were solutions known as Powerwalls and CAVEs, where people wear 3D glasses and look at stereoscopic 3D projected content. That content is typically projected on multiple sides, and it re-renders based on the user's tracked head - i.e., the tracked eyewear they are wearing. So, based on how they move, the content renders the correct perspective for them. They're immersed in virtual environments, looking at the content in relation to their position.
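The perspective correction in a CAVE is usually done with an asymmetric (off-axis) view frustum per wall, derived from the tracked head position. A simplified sketch, assuming the wall lies in the z = 0 plane of its own coordinate frame with the viewer at z > 0:

```python
# Simplified off-axis frustum for one CAVE wall. The tracked head position
# (in the wall's coordinate frame, metres) shrinks the wall extents onto the
# near plane by similar triangles, giving glFrustum-style left/right/bottom/top.
def off_axis_frustum(head, wall_left, wall_right, wall_bottom, wall_top, near=0.1):
    hx, hy, hz = head
    scale = near / hz  # similar triangles: wall plane -> near plane
    return (
        (wall_left   - hx) * scale,  # left
        (wall_right  - hx) * scale,  # right
        (wall_bottom - hy) * scale,  # bottom
        (wall_top    - hy) * scale,  # top
    )

# Head centred 2 m from a 3 m x 2.25 m wall: a symmetric frustum.
centred = off_axis_frustum((0.0, 0.0, 2.0), -1.5, 1.5, -1.125, 1.125)
# Step 0.5 m to the right: the frustum becomes asymmetric, so the wall
# re-renders with the correct perspective for the new head position.
shifted = off_axis_frustum((0.5, 0.0, 2.0), -1.5, 1.5, -1.125, 1.125)
```

Each frame, the tracker's head pose feeds this calculation for every wall, which is why the imagery stays glued to the physical room as the user moves.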


Two people wearing headsets with markers in a CAVE environment

Courtesy of ST Antycip



Are there any more applications?

Flaim case study

Of course! Optical tracking is used in simulators of all kinds. This could be a crane driving simulator, a racing car simulator, or a flight simulator for helicopters and planes...and let’s not forget medical tracking. 


There’s also location-based entertainment. When people wear 3D or VR headsets, they need to understand where the game is in relation to their environment.


Police officers wearing headsets in a simulation

For example, they may have a virtual torch in a dungeon scene, and they may need to interact with the scene using the torch. How do you track that object quickly and easily? You can track the headset with markers, or you can track the physical prop with markers.


A person in a virtual world and the real world, holding a weapon and a box

Courtesy of Existent


People wearing VR gear in a gaming room with neon lights and screens

For instance, virtual reality escape rooms work by tracking objects that you need to interact with.


3D optical marker-based tracking gives you the opportunity to put markers on almost any object and then understand where that object is located, and transpose that information into a virtual environment.
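Under the hood, recovering an object's position and orientation from its markers is typically a rigid-body fit: find the rotation and translation that best map a stored reference marker layout onto the observed marker positions. A minimal sketch using the standard Kabsch/SVD least-squares method (marker layouts here are invented for illustration):

```python
import numpy as np

def rigid_body_pose(reference, observed):
    """Best-fit rotation R and translation t mapping reference markers onto
    observed markers (Kabsch/SVD method). Inputs are (N, 3) position arrays."""
    ref_c = reference - reference.mean(axis=0)   # centre both marker clouds
    obs_c = observed - observed.mean(axis=0)
    H = ref_c.T @ obs_c                          # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    d = np.sign(np.linalg.det(Vt.T @ U.T))       # guard against reflections
    R = Vt.T @ np.diag([1.0, 1.0, d]) @ U.T
    t = observed.mean(axis=0) - R @ reference.mean(axis=0)
    return R, t

# Reference layout of four markers fixed to a prop (object coordinates, metres).
ref = np.array([[0.0, 0, 0], [0.1, 0, 0], [0, 0.1, 0], [0, 0, 0.1]])
# Simulate the tracker seeing the prop rotated 90 degrees about z and shifted.
Rz = np.array([[0.0, -1, 0], [1, 0, 0], [0, 0, 1]])
obs = ref @ Rz.T + np.array([1.0, 2.0, 0.5])
R, t = rigid_body_pose(ref, obs)   # recovers the rotation and the shift
```

Once R and t are known for a frame, any point on the physical object can be transposed into the virtual environment.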



Courtesy of Glava VR


Let’s talk about robotics.


University of York case study

This is (obviously) a wide field. We’ve talked about robotic arms, but there is also aerial and ground robotics.


Drones and other pre-programmed equipment are either validated by tracking systems or rely on them to produce data. This feeds into autonomous vehicle research for the transport of our near future.


Drone air shows are popular, large-scale events where multiple drones are tracked simultaneously in the sky. They tend to be programmed in very controlled environments before being let out into the real world, and those environments tend to be large tracking volumes.


We have customers in Spain, for example, who are prototyping and building drone vehicle transport for the humans of the future. They're utilising tracking systems to validate the position of the drones as they do their research, making sure they understand how they could improve stabilisation, landing, and take-off. These lab spaces are very large, spanning up to 10m high and 20m x 20m in width and length.


We have a few companies that conduct robotic research related to satellites where they mimic frictionless movement. They create replicas of satellites and test the docking procedure or remote repair of satellites in space and validate their algorithms by tracking the specific areas of the craft involved in these functions.


HAL Robotics case study

We have a client called HAL Robotics. They were working on a fantastic project using tracking systems to validate their robotic arms, which would then become 'cobots' (collaborative robots) for the construction industry.


The idea is that cobots grab construction equipment or materials and hand them to human workers as they build structures such as walls. The cobots use different types of technology to recognise the equipment and deliver those items to a human in a safe and efficient way.


A few years ago, we worked on a project with an online retail company.


Imagine you place an order online. A warehouse has massive robotic systems that pick goods, which then get placed onto a truck for delivery.


But these robotic systems need maintenance. Researchers are looking into ways to assist maintenance engineers on-site by using collaborative robots to hand them tools. Based on voice commands, an engineer might say, "Hand me a screwdriver," and the robot nearby finds the screwdriver, hands it over, and takes away the previous tool.

You still need a human to do the intricate mechanical engineering and repairs, but they're supported by robotic machinery.


Medicine is another fascinating area - I have two great examples! Firstly, using tracking systems to analyse top surgeons and how they operate.


There was a study at a London university for which we supplied tracking equipment. The researchers examined how surgeons performed abdominal surgical procedures using laparoscopic equipment.


One highly renowned surgeon would demonstrate the textbook method to trainees. But they found that the trainees struggled to reach the same level of proficiency.


So, they wanted to track and understand exactly what the surgeon was doing - not just what he thought he was doing! By comparing what he consciously described with what he unconsciously did, they found discrepancies.


The surgeon followed textbook guidance, for instance, "You must rotate your hand in this way," but when they tracked his actual movements, they noticed subtle nuances that made a difference—something he had developed over time through experience.


He struggled to convey this to trainees because he was so adept that he didn't even realise the finer details of what he was doing and the technique he had developed.


Tracking systems helped analyse how he operated with the equipment.



And the future?


I read this in an AI trade publication recently - in Beijing, this April, there’s going to be a half-marathon where humanoid robots will race alongside humans. Pretty incredible!


For a while now, there's been a lot of aspiration and research into exoskeletons—robotic machinery designed to help people with muscle diseases or spinal injuries regain mobility.


Essentially, the person wears an exoskeleton suit, which enhances their movement and acts as a mechanical prosthetic.


We recently worked with a company called HaptX, which specialises in haptic technology.


There was a segment on a TV programme where they explored the future of haptics. They showcased how haptic gloves allow users to feel touch - using force feedback that responds to their hand movements.


They paired this with robotic arms in a controlled factory setting. There was a glass divider, and on one side, a human operator wore haptic gloves.


As they moved their hands, the robotic arms mimicked their movements in real time—picking up delicate objects like an egg.


It really brings forward the potential of robotic teleoperation. This is already happening in surgical procedures, but it opens up new possibilities for people to manipulate objects remotely with precision.

In that same programme, they explored another use case - people with severe mobility restrictions.


Imagine someone who is cognitively sharp but physically unable to leave their house. They could work through a humanoid robot.



Through the robot - in a robot cafe, for example - they interact with customers, joke with them, take orders, and send the information to the kitchen. The food is still physically delivered by the robot, but the human presence is very much there.


Two people in a cafe and a robot holding a tray with sandwiches

Courtesy of Dawn Avatar Robot Cafe


It’s fascinating to think about how these technologies can enable people who are often overlooked in society to participate, work, and engage with the world.


Just to reiterate—this is already happening in places like San Francisco, where driverless taxis are now part of the landscape. No one thinks twice about them anymore.


The possibilities are vast.



Let’s talk about location-based animation or VFX


Other usage environments include location-based animation or VFX, where you're working with high-end AAA games that have sophisticated narratives and experiences, in which multiple people are tracked simultaneously while acting out scenarios.


Motion capture actor holding a gun prop and hiding

These scenarios could be mythical or real-life, but critically, how the performers move in relation to each other adds realism.


Being able to track them with markers ensures you know exactly where they are and how they interact with each other. You get a more realistic representation of those actors or performers in motion capture AND the props they’re operating with - these could be sabres or swords, for example. 


Trailer Farm Shadow Warrior case study

You may be creating a game where the actor goes through a door, opens it, puts their hand on the handle, and pulls it open - the perception of the ‘weight’ of that action is really important.


Now, we may not necessarily build a complete wall with a door but we may have a standing door frame that we’ll track, together with the door handle. This still delivers the ‘real’ relationship between the human, the prop and the set. 


Other examples include real-time broadcast, such as news programmes, weather programmes, or sports broadcasts, where the presenters are moving in front of a green screen or a virtual LED backdrop.


Often, these presenters are being tracked with markers (as small as lapel microphones!). They may have spotlights and cameras tracking them simultaneously so the feeling is natural. The content then re-renders based on where the presenter is. 


There are also applications in TV and film broadcasting. Game shows are a good example, where you view people in an environment that's entirely virtual and green-screened. All the operations behind the scenes involve tracking cameras, participants, presenters, and performers.


We’ve worked in situations where part of the mechanism behind gameplay in film and TV programmes involves tracked equipment - a performer might be using tracked elements in a game show, which aren't perceptible to the viewer at home, but optical marker-based tracking makes it all possible.



Let’s talk sports, biomechanics and life sciences


Finally, there's a sector I haven't discussed! Sports, biomechanics, and life sciences. People are utilising tracking equipment to understand human mobility or how people interact with sports equipment - for instance, how tennis players hold the racket and how they spin the ball.


Basketball case study

Golf is a big market where amateur golfers—or should I say, consumer golfers—want to improve their game. They use tracking equipment to analyse the quality of their swing, how much they turn their wrists, hands, or shoulders at certain points and how they can adjust to improve.


Sky Sports case study

Sports trainers and coaches use tracking systems to better understand and instruct athletes. The coach can play back pre-recorded data, point out areas of error, and identify room for improvement. All of this is achieved using optical marker-based systems, which track the body, the arms, the club, and the face of the club - at very high speed, too.
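One simple analysis you can run on that pre-recorded data is estimating club-head speed by differencing consecutive tracked positions. A sketch with invented sample values (real capture rates and positions would come from the tracking system):

```python
import math

def peak_speed(positions, fps):
    """Peak speed (m/s) from a sequence of tracked 3D positions, using
    finite differences between consecutive frames captured at `fps` Hz."""
    speeds = [
        math.dist(a, b) * fps          # per-frame displacement * frame rate
        for a, b in zip(positions, positions[1:])
    ]
    return max(speeds)

# Hypothetical club-head samples captured at 240 fps (metres).
samples = [(0.00, 0, 0), (0.05, 0, 0), (0.17, 0, 0), (0.35, 0, 0)]
speed = peak_speed(samples, fps=240)   # ~43.2 m/s for the fastest step
```

The same differencing, applied to wrist, shoulder, and club-face markers, is what lets a coach quantify a swing rather than just eyeball it.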



Let’s talk about integrating tracking systems with third-party tools


The final point I want to talk about is the integration of tracking systems with third-party tools. 


For example, you may have seen shows using projection mapping onto movable objects—these could be large or small, including people, or the sides of objects such as flags or sails. 


There could also be moving elements on stage - such as theatre.


Optical marker-based tracking systems can be integrated with projection mapping: the tracked object's position and orientation dictate how the projected pixels change as it moves.
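A minimal sketch of that link, under simplifying assumptions (a pinhole model of the projector, with illustrative pose and lens numbers): take a content point defined on the object, move it with the tracked pose, then project it to find which projector pixel should light it.

```python
import numpy as np

def project_point(obj_point, R, t, fx, fy, cx, cy):
    """Map a point on the tracked object into projector pixel coordinates.
    R, t: tracked pose (object frame -> projector frame).
    fx, fy, cx, cy: pinhole parameters of the projector."""
    X, Y, Z = R @ obj_point + t        # move the point with the object
    return (fx * X / Z + cx, fy * Y / Z + cy)

# A content point fixed on the object's surface (object coordinates, metres).
p = np.array([0.1, 0.0, 0.0])
# Tracked pose this frame: identity rotation, object 2 m from the projector.
R, t = np.eye(3), np.array([0.0, 0.0, 2.0])
u, v = project_point(p, R, t, fx=1500, fy=1500, cx=960, cy=540)
# Re-running this mapping with each new tracked pose keeps the projected
# pixels locked to the moving object.
```

In a real show the projector is calibrated rather than guessed, but the per-frame loop is the same: tracked pose in, updated pixels out.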


We’ve done projects - Bild Studios is a great example - where we demonstrated the capabilities of an MRI scanner manufacturer by projecting onto a person’s body to show the position of the organs.



In conclusion….


Marker-based tracking isn't just for VFX and animation - it's revolutionising industries. From perfecting robotics to enhancing sports performance and surgical training, its impact is everywhere. Whether enabling virtual worlds or powering exoskeletons, this tech is shaping the future in ways we're only beginning to imagine, and I'm excited to see where it goes.


 

Now you know what it is, see how we can design and deliver a marker-based tracking solution. Speak to our sales team at sales@target3d.co.uk or call us on (+44) 0203 488 2575 and take the first step.
