I am often asked for my LinkedIn profile, or indeed why I am not on LinkedIn more visibly or actively.
My career has shown me that everything we do – online and frequently offline too – leaves digital footprints and data trails, which we either forget about or lose control of.
I make a very conscious effort to try to protect my personal privacy, and that of my family – including my two primary-school-aged children.
Consequently, I have only the barest minimum presence on Microsoft’s LinkedIn social networking service.
To explain in more detail –
Our online and offline lives leave data trails and electronic footprints almost everywhere. These breadcrumb trails last forever; as the old adage goes, “the internet never forgets”.
These fragmented pieces of information are collected, organised and compiled, sold and bought, and aggregated, outside of our sight and beyond our awareness.
The mostly unregulated industries that operate in this space use these data sets to build forensically accurate and deep profiles of each of us, which reveal alarming levels of detail about:
- who we are – both as individuals and as part of our families
- who we know and interact with
- what we like and don’t like
- what we do
- where we go
- what we buy
- our strengths, weaknesses, hopes and fears
These detailed profiles are bought and sold by data brokers and information aggregators, and the data sets can be combined and re-used with very few limits or safeguards on who uses them and for which purposes.
Social networking platforms are amongst the richest sources of this hyper-personalised profiling data, but it is also gathered from sources such as:
- online browsing
- search engine queries
- purchases
- public records
- operating systems and apps on our devices
- conversations we have with cloud-based LLMs / AI chatbots like ChatGPT
This data collection is rarely fully explained up-front, the services that gather it don’t always clearly ask for permission, and it’s often hard to know exactly what information they hold about you.
For the most part, the main “benefit” to us as private individuals is to be shown better adverts.
It’s a largely invisible industry that is becoming increasingly powerful and complex, and it has significant ethical implications. Existing laws fail to adequately protect us, and don’t meaningfully prevent the abuse of the data held about us.
It also raises concerns about privacy, fairness, and the potential for manipulation or discrimination, because these profiles can be used against you in real-world situations: whether you get a credit card, mortgage, or other financial service; the price you pay for insurance, or the likelihood of an insurance claim being approved; and the types of goods and services you are offered at all.
Here is a summary of the issues I am concerned about:
- Data Broking is Pervasive
- The industry collects information from numerous sources (public records, online activity, apps, etc.), packages it, and sells these “insights” to other businesses for targeted marketing and more.
- Analytics are Key
- Data broking isn’t just about collection; it’s about analysing and matching data sets, using statistical modelling, machine learning, and data mining, to create broader profiles of us and our lives.
- Lack of Awareness is a Problem
- Most individuals are largely unaware of the extent to which they are profiled by data brokers and aggregators, leading to a significant power imbalance.
- Ethical Concerns are Significant
- Ethical issues include lack of meaningful consent, potential for discrimination, manipulation through targeted advertising, privacy violations, and security risks.
- Transparency is Limited
- The data aggregation industry operates with a high degree of opacity, making it difficult to understand data sources, algorithms, and how decisions are made.
- Current Safeguards are Insufficient
- Existing legal and technical safeguards (like privacy laws such as GDPR, and data de-personalisation techniques) have significant technical and regulatory limitations, and are trivially worked around.
- Discrimination Remains a Real Risk
- Businesses can aggregate data and use it to profile individuals in discriminatory ways despite existing anti-discrimination laws; proving the intent to do so is difficult, and not well tested in the courts.
- Industry Self-Regulation is Weak
- Voluntary guidelines and best practices lack strong enforcement mechanisms and often don’t address ethical concerns.
- A Multi-faceted Solution is Needed
- Effective safeguards require a combination of stronger regulations, increased enforcement, independent audits, transparency requirements, and public education.
- The Situation is Developing at Pace
- The field of data privacy is constantly evolving, with technological advancement requiring continuous adaptation and vigilance. The need to balance innovation, economic growth, and individual privacy remains central to the discussion.
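To make the aggregation and matching point above concrete, here is a minimal illustrative sketch, using entirely made-up data and a hypothetical shared identifier (an email address). It shows how two individually innocuous data sets, once joined, yield a combined profile that neither set contains on its own – this is a simplified toy, not a depiction of any real broker’s systems:

```python
# Illustrative sketch with hypothetical data: two separately harmless
# data sets, joined on a shared identifier, produce a combined profile.

# Data set A: a retailer's loyalty-card purchase records
purchases = [
    {"email": "pat@example.com", "item": "pregnancy vitamins"},
    {"email": "sam@example.com", "item": "running shoes"},
]

# Data set B: a "fitness" app's exported location summaries
locations = [
    {"email": "pat@example.com", "home_area": "SW1A"},
    {"email": "sam@example.com", "home_area": "M1"},
]

def build_profiles(*datasets):
    """Merge records that share an identifier into one profile each."""
    profiles = {}
    for dataset in datasets:
        for record in dataset:
            key = record["email"]
            profiles.setdefault(key, {}).update(record)
    return profiles

profiles = build_profiles(purchases, locations)
print(profiles["pat@example.com"])
# Each merged profile now links identity, a health-related purchase,
# and an approximate home location in a single record.
```

Real brokers do this at vastly greater scale, with probabilistic matching across many identifiers rather than a single clean key, which is exactly why “de-personalised” data sets can often be re-linked.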
Bearing all of these things in mind, and as both (1) a keen and enthusiastic technologist and (2) a believer in the fundamental human right of privacy – I have chosen to avoid the use of social networking services as far as is reasonably possible.
Where I have to use them – such as Microsoft’s LinkedIn service – I provide the least information I can.
Thanks for reading.