Has a company used technology to breach your data rights? If so, you could be due compensation. If you are a victim of a data protection failure, check out our current group actions below to see if we are running a claim related to that specific breach.
Companies collect and use personal data to make vast sums of money. But the misuse of this data can be devastating.
WorkerTech is harming workers in the gig economy. Personal data is being misused to target individuals via social media and exploit their vulnerabilities. Personal credit data is used to make unfair decisions about people. And algorithmic systems are routinely discriminating against women and ethnic minorities.
The ugly truth is that big companies are collecting and misusing personal data in many different ways. But you can claim compensation if a company has breached your data protection or privacy rights.
Data privacy and protection have never been more important, or more vulnerable. We defend the rights of individuals: if technology has been used in a way that has caused you harm, we can help you get compensation. In some cases, this could be as much as £5,000 – £20,000.
Keller Postman UK is a group action law firm. With a group action claim, you and the other victims join together and fight to get compensation. Group actions can be a powerful tool and can have a bigger impact than a single claim.
If you are involved in a potential group action not listed below, please contact us and tell us about it! Where enough people come forward, we may launch a new claim.
We do not take on individual cases.
Don’t let the fear of costs stop you getting the justice you deserve. Contact us today and let us fight for you.
Big organisations rely on user data to make money. But data is a form of currency, and your consent is required before it is monetised.
Organisations collect huge amounts of personal data and can use it to make adverse and unfair assessments about us (e.g., about our financial or medical future).
Gig economy employers use WorkerTech to maintain managerial oversight. But these technologies often don’t work as well as claimed, and this is costing people their livelihoods.
We often share our data with companies in return for services and products. But all too often, this data is being passed to third parties.
Despite strict data protection regulations, many organisations have inadequate security measures in place. So private information is routinely falling into the hands of cybercriminals.
CCTV, mass surveillance technology, and smart devices could be breaching your data privacy rights.
The government, the police, and private businesses use technology to fight crime. This includes CCTV and live facial recognition (LFR). But there are ethical and privacy concerns over using such tech – especially when it comes to profiling and automated decision-making about individuals. The potential impact on our data privacy rights is so concerning that the UK’s data protection regulator has warned against the reckless and inappropriate use of such technology in public places.
Some of the biggest companies in the gig economy sector use AI, algorithms, and facial recognition tech. But these technologies are costing people their livelihoods and discriminating against women and BAME groups. The rise in homeworking has also led to increased use of monitoring systems, as companies deploy dehumanising surveillance tools to track their employees and gig-economy workers. Because of failures in algorithmic management tools, some gig economy workers have been locked out and left unable to work, or even fired – all through no fault of their own. And with automated processes in place, many are left without the right of appeal.
Professional sportspeople, passionate amateurs, those who like to keep active, and people looking to get fit, could have had their data exploited by wearable devices. The data captured can include identity, location, and health status, which creates obvious privacy and security risks. For example, what would happen if an insurance company got hold of your data and increased your premium?
Criminals are targeting the cryptocurrency market – with the equivalent of millions of pounds being stolen from cryptocurrency holdings each year. Cryptocurrency fraud sometimes happens after a data breach, as criminals use the data exposed in breaches (often usernames and passwords) to access a person’s online accounts.
We all hand over personal information in return for services. But there are concerns over whether app providers can protect our data privacy, and over who they share our data with. According to researchers at the University of Oxford, the number of Android apps harvesting user data and feeding it back to parent company Google is “out of control”.
Many of us have an active online life. But are you confident in your social media security? As well as the threat of social media fraud and data breaches, private information has been used to manipulate voting behaviour and sell products. We must understand how our information is used and the impact on our privacy.
Smart devices have the potential to deliver enormous benefits. But your exposure to data harvesting and breaches depends on the number and type of smart devices you own, so it’s essential to understand the risks.
Partner Kingsley Hayes discusses in Law360 the High Court’s judgment in the Prismall v. Google case and its future implications for data privacy group litigation. Kingsley’s article was published in Law360, 27 June 2023, and can be found here.
Head of Data and Privacy Litigation Kingsley Hayes discusses the regulatory lacuna surrounding the use of live facial recognition in the UK in Computer Weekly. Kingsley’s article was published in Computer Weekly, 8 June 2022, and can be found here.
Elizabeth Denham, the UK Information Commissioner, has said she is “deeply concerned” about the use of live facial recognition (LFR). Commenting in a blog post, Ms Denham addressed privacy worries over the use of live facial recognition technology in public places.