I was recently watching the new video from Veritasium, Exposing The Flaw In Our Phone System, where they exploited SS7 by connecting through different GTs (Global Titles) to hack cellular communications.
A key point raised by guest hacker Karsten Nohl touched on a significant privacy concern that I had been grappling with for the past few months. [Exact time stamp].
...... Whether privacy intrusion is a problem for everyone individually? .... It’s almost a philosophical question, right? Somebody who grew up more in the Berlin tradition of the Chaos Computer Club, like myself, strongly believes that privacy and the ability to ... form your own thoughts without being observed is a prerequisite for democracy.
But many other people would argue “nothing to hide, nothing to fear”....
One of the examples shown in the video was "The story of Latifa, Dubai’s fugitive princess" and how she was captured off the coast of Goa while fleeing to India [news link], possibly using the vulnerability in the SS7 communication protocol.
Latifa’s capture illustrates the chilling reality of how modern governments, especially authoritarian regimes, leverage technology to maintain control and crush dissent. In an era where digital connectivity is ubiquitous, privacy is more fragile than ever. Surveillance technologies are no longer restricted to intelligence agencies; they’ve become integrated into everyday life, transforming the world into an open-air prison for those targeted by the state.
The UAE, like many other countries, has embraced tools such as mass surveillance, biometric identification, AI-driven facial recognition, and geolocation tracking, often under the guise of national security. Such technology is a double-edged sword. While it promises efficiency and protection against crime, in the hands of autocratic governments, it can easily be used to stifle dissent, monitor citizens, and suppress democratic movements.
The case of Latifa shows how privacy-invasive technology can be weaponized against individuals seeking freedom. Governments have unprecedented access to personal information, making it nearly impossible to hide from the digital panopticon. Surveillance cameras dotting public spaces can recognize faces and track movements in real-time, while drones, satellites, and AI programs analyze vast datasets to predict or respond to threats—often at the cost of human rights.
Latifa’s story is not an isolated incident. Across the world, similar technologies are being used to undermine democratic movements and silence political opponents. In Hong Kong, facial recognition was employed during the pro-democracy protests, allowing authorities to identify and arrest protesters. In China’s Xinjiang region, invasive surveillance is a tool of repression against the Uighur population, with biometric data collection and AI systems flagging individuals for arrest [news article].
As technology becomes more sophisticated, the divide between personal freedom and state control grows thinner. Latifa’s capture is a harrowing reminder of this reality. She was not just running from her family; she was running from an all-seeing system designed to track, control, and eventually return her to captivity.
Her story serves as a stark warning for the future: when governments hold the keys to powerful surveillance tools, the potential for abuse is immense. As technology advances, so too must our vigilance in safeguarding privacy, human rights, and democracy. Otherwise, what happened to Latifa could become the norm for any who dare to dream of freedom.
Here are some Indian government policies that raise privacy concerns and have been in recent headlines:
The DPDP Act allows wide-ranging exemptions for government agencies when processing data for reasons like national security, public order, and friendly relations with foreign states. It also mandates that citizens be informed of any data breach, but there’s concern over the lack of detailed protections, especially since publicly available data is excluded from its scope. Additionally, children's data is highly regulated, requiring parental consent for those under 18, raising questions about privacy and feasibility. (Carnegie Peace)(Business Today)
The mandatory linkage of Aadhaar (India’s national biometric ID system) with various services, such as bank accounts, SIM cards, and welfare schemes, is viewed as a potential violation of privacy. There are also concerns about how the biometric data is stored and used, especially in light of past data breaches [links]. The Supreme Court had previously restricted Aadhaar’s mandatory use, but government efforts to expand its reach persist.
The government's deployment of facial recognition technology (FRT) in public spaces raises privacy concerns, especially when used for law enforcement. India lacks robust laws to govern how this data is collected, processed, or stored, potentially leading to misuse and mass surveillance.
The Indian government’s increasing surveillance over telecommunications, including social media and encrypted messaging platforms, under the guise of cybersecurity and anti-terrorism measures, has been criticized as overreach. This is particularly visible in laws that allow real-time monitoring of online activities. [WhatsApp]
In some cases (which I am seeing more and more frequently), governments may strategically plant stories or run disinformation campaigns to sway public opinion and build trust in their policies. These actions, often described as “cyberbullying” in the digital sphere, exploit media platforms to manipulate narratives, discredit dissenters, and promote a favorable image. By shaping online discourse, governments can not only suppress opposition but also control the perception of legitimacy.
Example: Recently the government wanted to "explore" banning the Telegram app [Telegram Ban in India], an app that is a direct boon to students because it is end-to-end encrypted and allows sharing massive files, including pirated movies. Recently there was news about how a Telegram bot was used to leak data. Chatbots are nothing but a quick way to interface with a kind of web form, much like how a user submits a query on Google and gets data back. The data was leaked separately, and a bot was created merely as an interface to it (a minimal sketch of this idea follows the headlines below), but do take note of how the news articles were framed:
[Economic Times]: Hacker uses Telegram chatbots to leak data of top Indian insurer Star Health.
[Business Standard]: Hacker uses Telegram chatbots to leak data of Indian insurer Star Health.
[Indian Express]: Hacker uses Telegram chatbots to leak data of Star Health Insurance: Report.
[Business Today]: Stolen Star Health customer data exposed via Telegram chatbots, raising security concerns in India.
[MoneyControl]: Star Health customer medical records leaked on Telegram chatbots: Report.
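To underline the point above (the chatbot was only a front end; the data itself had already been exfiltrated elsewhere), here is a minimal, hypothetical sketch of how any such bot boils down to a lookup over a dataset the operator already holds. All records and command names below are invented for illustration; this is not the actual bot.

```python
# Minimal sketch: a "chatbot" is just a thin command interface over data
# the operator already possesses. All records below are fabricated examples.

FAKE_RECORDS = {
    "POL123": {"name": "A. Sample", "claim": "approved"},
    "POL456": {"name": "B. Example", "claim": "pending"},
}

def handle_message(text: str) -> str:
    """Parse a chat command like '/lookup POL123' and return a reply.

    The bot adds no new capability: it only exposes whatever dataset it was
    wired to, exactly like a web form sitting in front of a database.
    """
    parts = text.strip().split()
    if len(parts) == 2 and parts[0] == "/lookup":
        record = FAKE_RECORDS.get(parts[1])
        return str(record) if record else "No record found."
    return "Usage: /lookup <policy-id>"

if __name__ == "__main__":
    print(handle_message("/lookup POL123"))  # returns the fabricated record
    print(handle_message("/lookup POL999"))  # -> No record found.
```

The interesting part is what is missing: there is no hacking in the bot itself. The breach happened wherever the dataset was obtained; the bot merely makes querying it convenient.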
Open Source Government Policies: Open-sourcing all government policies would involve making them transparent, accessible, and available for public review, including their development stages, implementation plans, and potential revisions, similar to how everyone discusses and contributes to an open repository:
Open Data Platforms: All government policies, especially those affecting data privacy, cybersecurity, and surveillance, should be hosted on open data platforms. These platforms should provide access to the entire legislative framework, current amendments, and ongoing policy proposals.
The National Data Sharing and Accessibility Policy (NDSAP) is aimed at increasing access to government data, but such policies could go further to include legal texts and policies in progress.
Use of Open-Source Repositories: Governments can host policy documents, revisions, and public discussions on open-source platforms like GitHub. This allows version control, transparent contributions, and a history of changes made to the policy.
Governments could leverage platforms like GitHub for policy transparency, much as France used open-source-style public contributions for its Digital Republic Bill.
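To make the repository analogy concrete, here is a small sketch, assuming hypothetical file names and placeholder policy text, of how a diff between two drafts of a policy clause can be generated; a GitHub pull request renders essentially the same artefact, which is what makes every change attributable and reviewable.

```python
import difflib

# Hypothetical drafts of a policy clause; in a real repository these would be
# files such as policies/section-17.md tracked across commits.
draft_v1 = """Government agencies may be exempted from consent requirements
in the interest of public order.""".splitlines()

draft_v2 = """Government agencies may be exempted from consent requirements
only by a written, published order, reviewable by an independent board.""".splitlines()

# A unified diff shows exactly what changed between the two versions,
# line by line, with the old and new file labels attached.
for line in difflib.unified_diff(draft_v1, draft_v2,
                                 fromfile="policy/section-17@v1",
                                 tofile="policy/section-17@v2",
                                 lineterm=""):
    print(line)
```

With the full history in version control, anyone can ask who changed which clause, when, and under which discussion thread, instead of comparing PDFs by hand.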
Transparent Algorithms: If policies involve the use of algorithms (e.g., for surveillance or welfare distribution), those algorithms should be made open-source, allowing independent experts to audit them for fairness and privacy compliance.
India has been critiqued for the opaque algorithms used in the Aadhaar biometric authentication system. Open-sourcing such systems would provide transparency into how personal data is processed and secured.
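As a toy illustration of what "open-sourcing the algorithm" could mean in practice, here is a hypothetical welfare-eligibility rule written so that every threshold is a named, visible constant that an outside auditor can test. The thresholds and field names are my own placeholders, not drawn from Aadhaar or any real scheme.

```python
from dataclasses import dataclass

# All thresholds are named constants so an auditor can see exactly what the
# rule does; the numbers are illustrative, not those of any real scheme.
INCOME_CEILING_INR = 120_000   # annual household income cutoff
MAX_LAND_HOLDING_HA = 2.0      # landholding cutoff in hectares

@dataclass
class Household:
    annual_income_inr: int
    land_holding_ha: float
    has_salaried_member: bool

def is_eligible(h: Household) -> bool:
    """Published eligibility rule: deterministic, with no hidden scoring."""
    return (h.annual_income_inr <= INCOME_CEILING_INR
            and h.land_holding_ha <= MAX_LAND_HOLDING_HA
            and not h.has_salaried_member)

# An independent reviewer can replay the rule on test cases:
assert is_eligible(Household(90_000, 0.5, False))
assert not is_eligible(Household(300_000, 0.5, False))
```

The point is not the specific rule but that it can be read, challenged, and re-run by anyone, which is exactly what an opaque scoring system prevents.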
Independent Oversight Committees: Policies related to national security, privacy, and surveillance should be overseen by independent watchdog organizations that can review and report on them in real time. These organizations should have access to the full details of these policies, including the reasons for their development and any potential consequences for privacy.
Each policy or new law that is under government discussion should undergo threat modeling, following a well-defined short- and long-term analysis to identify its blast radius if things go wrong at any level. The report should spell out contingencies for data access and require manual approvals with notification to higher-ups, holding them accountable if anything goes wrong and avoiding plausible deniability. The Indian government did release the draft of the Digital Personal Data Protection (DPDP) Bill, 2023 for public feedback (Linklaters). Each policy’s driving factors should be analysed to ensure the rules do not single out a specific section of the population. The report should be made public and traceable, recording who worked on it, who explored the threats, and who audited the result. An automated, template-based solution of the kind that already works in the tech industry could be co-opted by governments.
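Here is a minimal sketch of the kind of template-driven threat-model record imagined above, with fields for blast radius, contingencies, and named approvers so accountability stays traceable. The field names and the sample entry are assumptions for illustration, not any real filing.

```python
from dataclasses import dataclass, field
from datetime import date

@dataclass
class ThreatModelEntry:
    """One record in a public, auditable threat-model register for a policy."""
    policy_name: str
    data_accessed: list[str]      # categories of personal data touched
    blast_radius: str             # who is affected if this goes wrong
    contingencies: list[str]      # fallback / breach-response steps
    approved_by: list[str]        # named approvers, to avoid plausible deniability
    review_date: date = field(default_factory=date.today)

    def summary(self) -> str:
        return (f"{self.policy_name}: touches {', '.join(self.data_accessed)}; "
                f"blast radius: {self.blast_radius}; "
                f"approved by {', '.join(self.approved_by)}")

# Illustrative entry; names and details are placeholders, not real filings.
entry = ThreatModelEntry(
    policy_name="Hypothetical FRT deployment at railway stations",
    data_accessed=["facial images", "location"],
    blast_radius="all passengers passing covered stations",
    contingencies=["purge footage after 30 days", "notify subjects on breach"],
    approved_by=["Responsible official (placeholder)", "Independent auditor (placeholder)"],
)
print(entry.summary())
```

Because every entry names its approvers and contingencies up front, an audit after an incident can point to a concrete record rather than a chain of deniable verbal decisions.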
Open Government Partnership (OGP): India is not part of the Open Government Partnership, a multilateral initiative that promotes transparent, participatory, and accountable governance. Joining such initiatives could help India align with global standards for open governance and policy-making.
We could tighten the scope of government exemptions, especially in cases where "public order" or "national security" could be exploited. Clearer definitions of terms and better regulation of how public data is handled would reduce potential misuse. We could also lower the age threshold for parental consent (as seen in the GDPR, where it can be as low as 13 or 16) to balance protection with practicality.
Access Control / Contingent Authorization: Establish clear guidelines for the use of new technology, with defined scopes, transparency about its deployment, and proper oversight mechanisms. Consent should be required before collecting biometric data, except for public safety in narrowly specified cases (a minimal sketch of such a check follows below).
Adopt a framework similar to Europe’s GDPR that allows data collection only when strictly necessary and ensures judicial oversight before accessing personal communications. Clearer limits should be placed on real-time monitoring, with enhanced accountability for the agencies using these powers.
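To sketch what contingent authorization with judicial oversight could look like at the code level, here is a hypothetical access-control check: personal data is released only if the request carries recorded consent or a specific judicial order, and every decision lands in an audit log that an oversight body can review. The function and field names are assumptions for illustration, not any real system.

```python
from dataclasses import dataclass
from datetime import datetime, timezone

@dataclass
class AccessRequest:
    requester: str                  # e.g. a named agency officer
    data_category: str              # "biometric", "communications", ...
    subject_consented: bool         # explicit consent on record
    judicial_order_id: str | None   # warrant / court order reference, if any

AUDIT_LOG: list[str] = []

def authorize(req: AccessRequest) -> bool:
    """Grant access only with consent or a specific judicial order.

    Every decision is appended to an audit log so oversight bodies can
    review who accessed what, and on what legal basis.
    """
    allowed = req.subject_consented or req.judicial_order_id is not None
    AUDIT_LOG.append(
        f"{datetime.now(timezone.utc).isoformat()} {req.requester} "
        f"{req.data_category} basis="
        f"{'consent' if req.subject_consented else req.judicial_order_id} "
        f"-> {'GRANTED' if allowed else 'DENIED'}"
    )
    return allowed

# Illustrative use:
print(authorize(AccessRequest("officer-42", "biometric", False, None)))          # DENIED
print(authorize(AccessRequest("officer-42", "communications", False, "JO-17")))  # GRANTED
print(AUDIT_LOG)
```

The specifics matter less than the shape: access is contingent on a recorded legal basis, and the trail of decisions exists independently of the agency making them.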