
Neural Notes: F1 drivers are walking datasets. Are regular workers next?

Welcome back to Neural Notes, a weekly column where I look at how AI is influencing Australia. In this edition: who owns the data your body produces at work?

The Melbourne GP is coming up, and as the sport has become more tech-focused, one thing has grown increasingly prominent: a driver’s body is part of the equipment.

Drivers go to work wired. Heart rate, blood oxygen, stress responses, and even eye movements are streamed from gloves, suits and simulators into proprietary team models. These days, AI plays a large part in that.

Engineers layer that with thousands of telemetry signals from the car to refine race strategy, training programs and the actual cars.

If a driver changes teams or retires, the biometric and performance datasets built around them typically remain within the organisation’s systems. They form part of the competitive archive that informs future strategy and development, rather than something a driver simply takes with them.

F1 makes this dynamic visible. It’s written into contracts, competitive strategy, and safety protocols.

Regulators such as the Fédération Internationale de l’Automobile (FIA) have also moved to issue guidelines on how biometric data can be collected and used. This includes when it appears in broadcasts or is used for marketing and entertainment purposes.

Outside sport, we tend to assume there’s still a line between a ‘work product’ and ‘the person who produced it’. An employee’s spreadsheet belongs to the company. But their nervous system does not.

Once you, as an employer, start instrumenting people, that line becomes decidedly blurred. And it’s already happening in Australia.

When a workplace wants body data

The most famous example is Jeremy Lee, a sawmill worker in Queensland who refused to hand over his fingerprints for a new attendance system. As a result, he was fired.


In 2019, a full bench of the Fair Work Commission found the dismissal unfair and treated fingerprint data as sensitive information under the Privacy Act, meaning it could not be collected without consent.

The decision also highlighted the complexity of the employee records exemption. While employers may rely on that exemption once records are held, the collection of sensitive information still requires valid, voluntary and informed consent.

Since then, the technology has only spread. Warehouses, abattoirs and construction sites use fingerprint or facial scanners for access and time-and-attendance. Some hospitality venues have adopted similar systems. Legal updates increasingly refer to “emerging issues” in employment law as biometrics are bundled into safety, security and payroll tools.

Add AI, and the picture becomes more complex.

A report by the United Workers Union on surveillance in Australian workplaces describes staff being tracked through cameras, keyloggers, GPS systems, productivity dashboards and algorithmic rostering. Sometimes this occurs with little transparency and few meaningful options to opt out. Biometric authentication and “wellbeing” wearables are also starting to appear in that mix.

It doesn’t look like a pit wall full of engineers watching a live feed from a world-class driver. But the logic is similar. If the organisation can measure something about your body while you’re on the job, it is increasingly treated as something that can be captured, stored and fed into its systems.

The question is, who owns the data your body produces at work?

An employee’s body, an employer’s AI dataset?

Interestingly, biometrics don’t fit neatly into the usual rules.

On one level, they are just entries in a database. On another, they are extensions of the body in digital form. You can change a password. You cannot easily change your fingerprints or facial geometry. Once those templates leak or are repurposed, you do not get to reset them.

Australian privacy law recognises this in theory. Biometric identifiers and certain health data are classified as sensitive information, attracting higher thresholds for consent and security. In practice, employers often rely on the employee records exemption and the reality that most workers will not challenge a fingerprint scanner.


The Lee case exposed this issue. The Commission accepted that a fingerprint is sensitive information and that valid consent must be voluntary and informed, not implied simply by turning up to work. At the same time, once an employee record is lawfully held, the exemption can limit how the Privacy Act applies to its subsequent handling.

The result is a messy middle. Biometrics are treated as special when someone pushes back, and as routine infrastructure the rest of the time.

This becomes all the more tricky when you add AI.

When employers train systems on staff behaviour, voice data, biometrics or performance patterns, those systems do not simply disappear when someone resigns.

Even if an employer deletes your file, the model weights derived from you carry on. The body walks out the door, but the dataset probably doesn’t.

The reassuring answer is for everything to be consent-based. In an ideal world, the system would only be as invasive as we collectively allow it to be.

Unfortunately, reality is more complicated.

Workplace surveillance inquiries in Australia have already pointed out how thin “consent” looks when the alternative is losing your job, or being labelled “non-compliant” in a tight labour market.

Biometric collection in employment settings rarely feels like a negotiation between equals. In many cases, it is presented as a condition of entry to the building, to the shift, to the payroll system.

People intuitively recognise a difference between monitoring what they do at work and capturing permanent, body-linked identifiers as the price of participation. But the legal and technical frameworks do not always draw that line clearly.

It doesn’t even have to be that dramatic. The normalisation of recorded meetings and everyday AI work tools breeds complacency just as quickly.

What kind of workplaces are we building in the age of AI?

This is not an argument against safety tools, fraud prevention or performance analytics. In genuinely high-risk environments, knowing someone is about to black out or that an unauthorised person is accessing a hazardous site can matter.

In Formula 1, the entire enterprise is about operating at the edge of human and mechanical capability. Instrumentation is part of the job.

The question is what happens when those edge-case tools become default infrastructure. When “modern management” quietly comes to include treating staff as continuous sources of biometric and behavioural data. When the easiest response to any operational problem is to add another sensor.


F1 is at least transparent about the role of data in competition.

Most workplaces are less explicit. They keep the language of culture, care and convenience while rolling out systems that make people increasingly transparent to management and vendors.

Drivers have got used to going to work wired. Regular workers may need to decide how far they are willing to go before their bodies become just another permanent dataset inside an organisation’s systems.

And if it isn’t already, it should become an important consideration for employers, too.
