Related to data privacy, there have been growing concerns about how organizations store and protect the sensitive data they gather while customers use their digital products, across every industry and business sector.
I thought it would be interesting to hear what Rachel Maddow has to say about this, but I don’t have her on speed dial, so I asked the ChatGPT version of Rachel:
“Good evening, I’m (deepfake) Rachel Maddow and tonight we’re going to talk about one of the most pressing issues of our time: data privacy. Our personal information has become a valuable commodity, with companies and governments collecting vast amounts of data about us every day. But who owns that data, and how is it being used? These are questions that have taken on even greater importance in recent years, as we’ve seen a number of high-profile data breaches and a growing awareness of the risks posed by unchecked data collection.
Let’s start with some of the concerns. One of the biggest is the issue of data breaches. In 2020 alone, there were 1,001 publicly reported data breaches in the United States, exposing over 155 million records. These breaches can result in financial losses, identity theft, and a loss of trust in the institutions that we rely on to protect our data. In some cases, the breaches have been caused by simple human error, such as failing to secure a database or misconfiguring a server. But in other cases, they’ve been the result of deliberate attacks by hackers, seeking to profit from the sale of stolen data.
Another concern is the use of our data by companies and governments. We’ve all heard the stories of how social media companies like Facebook have been collecting vast amounts of personal information about their users, often without their knowledge or consent. This data is then used to target ads or influence our behavior, sometimes with serious consequences. In one example from 2018, it was revealed that Cambridge Analytica, a political consulting firm, had used data harvested from Facebook to create targeted ads during the 2016 US presidential election. This raised serious questions about the role of data in politics and the need for greater transparency and accountability.
So what can be done to protect our data privacy? One answer is regulation. In recent years, there has been a growing push for greater privacy protections at the state and federal level. In 2018, the European Union implemented the General Data Protection Regulation (GDPR), which gives EU citizens greater control over their personal data and requires companies to obtain explicit consent before collecting or using that data. In the US, there are a number of pending regulations that aim to do the same. The California Consumer Privacy Act (CCPA), which went into effect in 2020, gives Californians the right to know what data companies are collecting about them and to request that it be deleted. And in 2021, the proposed Consumer Data Privacy Act (CDPA) would create a national standard for data privacy, giving Americans greater control over their personal information.
The year 2023 has begun a fundamental shift in the philosophy underlying data privacy laws in the United States, moving towards a broader “rights-based” approach, as exemplified by the European Union’s General Data Protection Regulation (GDPR). Instead of the “harms-prevention-based” approach, where privacy protections sought to prevent or mitigate harms in specific sectors, individuals now have the legal right to control their personal information. California’s lead is being followed by four other states – Colorado, Connecticut, Utah, and Virginia – in enforcing new GDPR-inspired statutes in 2023. More states are expected to follow suit. This shift represents a comprehensive approach to privacy protection, applying to businesses across numerous sectors, in addition to the sector-specific laws that remain in place.
Data privacy is a complex and important issue that affects us all. From data breaches to targeted ads, our personal information is being collected and used in ways that we may not always be aware of. But with greater awareness and stronger regulations, we can ensure that our data is protected and that our privacy rights are respected. Thank you for joining me tonight, I’m deepfake Rachel Maddow.”
I think this relates to both data privacy and data authenticity. It is a concern in itself that ChatGPT can create an article so close to the original simply by crawling through media featuring MSNBC’s Rachel Maddow. Of course it is no comparison to seeing Rachel live on her show, but with the advances in technology, video-based deepfake technology is progressing on par, including a video of Morgan Freeman letting you know that he is actually not a real being, but an artificial intelligence deepfake.
There are, of course, promises in this technology. Even I would like to have a clone or digital twin of myself that could join conference calls and take over repetitive, non-creative work that I’d rather automate.
Also, I think there is a generational chasm: younger generations are more willing to provide access to their data in exchange for additional benefits, while generations closer to retirement age hold their information closer to the chest, having experienced identity theft in one form or another.
There need to be checks and balances in place to make sure we can detect when an artificial assistant or virtual representation of a real human being is being used. As usual, regulations have to catch up, and it comes back to ethical organizations to work with regulators to find a balance that preserves the speed of innovation without leaving ethics behind.