Security vs. privacy
Security and privacy are two topics that are rarely mentioned in the same breath in daily security operations. On the one hand, this can be a good thing: when tactical and strategic thought has already been given to safeguarding privacy within security, the operation can run freely within those frameworks. On the other hand, safeguarding privacy within security can be a slippery slope, and in my experience the latter is the rule rather than the exception, which means an unnecessary amount of "repair" work has to be done afterward, with all the associated costs.
Macro vs. micro
Security and privacy seem to serve conflicting interests when it comes to data collection: whereas "security" wants to use as much data as possible to protect the interests of an organization, "privacy" wants to collect as little data as possible, especially when that data concerns individuals. Where the balance between these two extremes ends up depends heavily on the environment in which they are applied. Just think of the United States, where people are willing to make very different privacy concessions for the sake of "security" than in Europe. And even between organizations you see significant differences: I can point to organizations where nobody cares, but also organizations where the decision-making process around implementing security monitoring takes more than a year and the Works Council plays an active role in the dialogue.
Conflict of interest?
Of course, the privacy point of view on data collection is far more nuanced than "as little as possible." However, at a time when we increasingly try to monitor people's behavior in order to recognize deviant behavior, the object of security monitoring is more and more the individual. We can count ourselves fortunate if the effectiveness and proportionality of that monitoring are then critically examined. That should be business as usual, although I rarely see it in practice.
Consult with each other
When new security technology is implemented, a Data Protection Impact Assessment (DPIA) is increasingly carried out before or at the start. This is an excellent opportunity to discuss and record effectiveness, proportionality, and possible mitigating measures with the various stakeholders, such as "security" and "privacy." That sounds great, but because of the conflict of interest, "security" benefits from implementing the desired technology with as little "hassle" as possible. Once the DPIA has been approved, who checks whether the implementation actually complies with the agreements laid down in it? What if the DPIA was drawn up at the strategic and tactical level while the operation is "at the controls," determining the configuration, and perhaps not even aware of the DPIA? Do you periodically evaluate the DPIA to make sure it still matches the latest functionality and configuration, or are you simply happy, from a security point of view, that the DPIA can stay closed once and for all?
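To make the "who checks afterward?" question concrete, here is a minimal sketch of what a periodic DPIA compliance check could look like: compare the data categories a tool collects in its current configuration against the categories recorded in the approved DPIA. Everything in it, the category names and the shape of both records, is an illustrative assumption, not any real product's export format.

import sys

# Data categories agreed upon in the DPIA at implementation time (illustrative).
DPIA_AGREED = {"process_events", "network_metadata", "file_hashes"}

# Data categories exported from the tool's configuration as it runs today (illustrative).
CURRENTLY_DEPLOYED = {"process_events", "network_metadata", "file_hashes", "browser_urls"}


def dpia_drift(agreed: set[str], deployed: set[str]) -> set[str]:
    """Return data categories collected in practice but never agreed in the DPIA."""
    return deployed - agreed


if __name__ == "__main__":
    drift = dpia_drift(DPIA_AGREED, CURRENTLY_DEPLOYED)
    if drift:
        print("Collected but not covered by the DPIA:", ", ".join(sorted(drift)))
        sys.exit(1)  # fail the periodic check so someone has to look at it
    print("Deployed configuration matches the DPIA agreements.")

The point of such a check is not the code itself but the discipline it forces: the DPIA becomes a living document that is re-read every time the configuration changes, instead of a book that stays closed.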
Case study: endpoint detection & response
In 2024, endpoint detection and response (EDR) technology is the most effective way to protect against, for example, ransomware. When organizations move to implementation, it is seen and presented as the replacement for the more traditional antivirus (AV) technology. And that's dangerous:
IT product vs. security product. AV has traditionally been an IT product, implemented and monitored by IT. EDR, on the other hand, is a security product: implemented by IT, but often used by "security." There is a reason for this: the possibilities of EDR technology are enormous compared to regular AV technology and require specific security expertise. The downside of those possibilities is that the privacy implications of EDR technology can be many times greater. Examples include collecting all URLs visited in a web browser and keeping track of every file opened on a workstation.
When EDR is seen as an IT product, performing a DPIA is much less obvious than when it is seen as a security product. A parallel: a next-generation firewall offers functionality similar to EDR in specific areas. In organizations that value privacy, security products are placed under a magnifying glass, while the firewall, seen as part of the IT infrastructure, escapes attention.
Privacy implications considered only after the fact. It would not be the first time that attention is paid to the privacy implications of EDR only after implementation. In such situations, you can only hope that a DPIA is still carried out and that its necessity is explained well to all employees. For the implementation of typical IT products, seeking out the conversation along the privacy axis is not an obvious step, especially in smaller organizations. If you only think about the privacy implications after the fact, you are too late: on the one hand because, as "IT" or "security," you then have all appearances against you, and on the other hand because not every EDR product lends itself equally well to granular, per-feature configuration of its privacy-sensitive functionality (see the sketch after this list). It would be annoying if the product you have implemented does not match what you would like to set up from a privacy point of view.
Lack of executive engagement. Especially when technology goes beyond IT, and certainly when it has privacy implications, the interests of the organization must be weighed against the interests of the individual. Strong leadership that can explain why specific cybersecurity measures are essential for the organization is indispensable. If that is missing, it is often IT (which carried out the implementation), and possibly "security," that has to defend the importance of EDR technology for the organization. And that is difficult, perhaps impossible, if management does not stand behind you.
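As a thought experiment, the sketch below shows what granular, per-feature configuration of privacy-sensitive EDR telemetry could look like in a vendor-neutral form. The field names are assumptions for illustration only; real EDR products differ widely in what can actually be switched off, which is exactly the point made above.

# Hypothetical, vendor-neutral telemetry policy for privacy-sensitive EDR features.
# None of these field names belong to a real product; they illustrate the kind of
# choices a DPIA conversation should pin down before implementation.
PRIVACY_SENSITIVE_POLICY = {
    "collect_browser_urls": False,   # visited URLs reveal individual behavior
    "track_opened_files": False,     # file access on a workstation is personal data too
    "capture_process_events": True,  # usually needed for ransomware detection
    "retention_days": 30,            # align with the retention period recorded in the DPIA
}


def enabled_collectors(policy: dict) -> list[str]:
    """List the privacy-sensitive collectors that are switched on, so a DPIA
    reviewer can see at a glance what the deployment actually gathers."""
    return [
        name
        for name, enabled in policy.items()
        if name.startswith(("collect_", "track_", "capture_")) and enabled
    ]


if __name__ == "__main__":
    print("Enabled collectors:", enabled_collectors(PRIVACY_SENSITIVE_POLICY))

If the product you chose cannot express this level of granularity at all, no amount of after-the-fact discussion will close the gap between what the DPIA promises and what the endpoint actually collects.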
Security = people + process + technology
This well-known combination of elements describes it quite accurately: implementing technology also has a human and a process element. People, in particular, are often forgotten, especially when it comes to safeguarding privacy. Certainly in Europe, where the privacy of the individual is weighed far more heavily than in many other parts of the world, we need to talk to each other. Not afterward, but in advance. Not with the idea that we have to "clear a hurdle," but constructively, to achieve the best result for both the organization and the individual.