Beijing has big plans to control big data. The tension has been building over the past decade, reaching a fever pitch in recent months. Shenzhen-based BGI was accused by the Trump Administration of “collecting, storing and exploiting biometric information from COVID tests”, essentially farming U.S. citizens’ data during the pandemic. More recently, ongoing revelations about the harvesting of personal information from users of the Chinese-owned TikTok have brought the government’s data practices under scrutiny.
As private Chinese companies come under fire for shady data protection policies, the Chinese government faces scrutiny of its own for insufficient protection. July’s data leak from the Shanghai Police represented “perhaps the largest data loss in the world,” according to Forbes, and has fueled a surge of private Chinese citizens’ data for sale, a Bloomberg article revealed.
Tensions continue to climb. In an increasingly virtual world — digital footprints have expanded exponentially since the pandemic — big data, or any dataset that holds confidential information or intellectual property, is the next asset to be weaponized. Each leak has highlighted the critical need for stringent cybersecurity protection, robust data practices, and responsible governance when it comes to managing data for any purpose: governmental, marketing, research studies or otherwise. In this context, the security of data lakes, databases, and storage and backup sites must be paramount for any organization operating on U.S. or Chinese soil.
Governments funnel millions into cybersecurity protection; why are databases still exposed?
It’s a double-edged sword: yes, governments spend millions to fortify their digital perimeters, but they invest at least a comparable amount of energy into nation-state attacks of their own. The United States National Security Agency is reported to have tapped into Chinese telecommunications networks via an email phishing attack on a leading university, opening a route into its suppliers. Sensitive data, network equipment details, administrative passwords and file-transfer protocol data were stolen.
The NSA hack highlights what we already know about why data security is so contentious. The rise of remote work since the pandemic has led to a proliferation of remote access sites, which the NSA hackers targeted with their email phishing attack. Mergers, acquisitions and expanding supply chains mean an organization’s digital perimeter extends far beyond the four walls of its network, significantly increasing the external attack surface and the likelihood of exposed and unknown assets. Note how the NSA gained access: the telecommunications company was a carrier for the university.
Attacks aside, the world we live in is undergoing a massive digital transformation. Organizations are transitioning to the cloud, increasing their reliance on third-party vendors and decentralizing data storage. Whereas before, asset inventories were neatly centralized within IT teams, marketing, engineering and sales teams now create their own assets for testing and development purposes, rendering central control of asset inventory nearly useless. As a result, assets slip through the cracks — transforming into easily exploitable opportunities for cyber assailants.
The simple answer is that databases are modern treasure chests. They are simply too valuable to ignore, so effort will always be put into acquiring their contents by any means necessary.
When weaponized data is a main component of cyber warfare, lean on best practice to defend data
Encrypting data, following best-practice configuration, keeping regular backups, separating servers and planning ahead are the most effective ways to prevent attacks and protect data before it’s exploited. Typically, databases are kept off public IPs to mitigate attack risk. This is acknowledged best practice among cybersecurity professionals; but for one reason or another, that often isn’t the case. Database admins frequently connect over a public IP for maintenance work, often without a VPN, and misconfiguration then leaves that public IP exposed. Without VPN, MFA, SSO, proxy protection or data encryption, valuable corporate information, intellectual property and private data could all be lost to malicious actors, a concerning prospect given that recent findings uncovered 270,000 exposed databases with Chinese IP addresses in July, and over 1.1 million vulnerable databases with American IP addresses.
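How does an exposed database get found? In practice, attackers (and defenders) simply probe well-known database ports on an address range. The sketch below, a minimal illustration and not any particular scanner’s implementation, checks whether the default ports of common database engines answer on a given host; the host shown is a placeholder, and such probes should only ever be run against systems you own or are authorized to test.

```python
# Minimal sketch: probe the default TCP ports of common database engines
# on a single host. A port that answers on a public IP is exactly the kind
# of misconfiguration described above. Only scan assets you are
# authorized to test.
import socket

# Default ports for common database engines.
DB_PORTS = {
    "MySQL": 3306,
    "PostgreSQL": 5432,
    "MongoDB": 27017,
    "Redis": 6379,
    "Elasticsearch": 9200,
}

def is_port_open(host: str, port: int, timeout: float = 2.0) -> bool:
    """Return True if a TCP connection to host:port succeeds."""
    try:
        with socket.create_connection((host, port), timeout=timeout):
            return True
    except OSError:
        return False

def scan_host(host: str) -> dict:
    """Map each engine name to whether its default port is reachable."""
    return {name: is_port_open(host, port) for name, port in DB_PORTS.items()}

if __name__ == "__main__":
    # Placeholder address; replace with an asset you are authorized to test.
    for engine, reachable in scan_host("127.0.0.1").items():
        print(f"{engine}: {'EXPOSED' if reachable else 'closed/filtered'}")
```

A database bound only to a private interface, or shielded behind a VPN and firewall, would show every port as closed or filtered from the public internet, which is the outcome the best practice above is meant to guarantee.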
Related exposures, like unmaintained web servers (e.g., Apache, NGINX and IIS), also open the door to network attacks that could result in a data leak via remote code execution (RCE) or man-in-the-middle attacks. As with databases, it’s critical that storage and backup sites also be kept off public IPs, so that opportunists will have a hard time tampering with corporate data or leveraging ransomware for profit.
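Unmaintained web servers are often identified in the first instance by the version banner they volunteer in the HTTP `Server` header, which lets attackers match a build against known CVEs. The snippet below is a simple illustrative check, assuming a hypothetical target URL, that flags servers disclosing a version number:

```python
# Minimal sketch: flag web servers that disclose a software version in the
# Server response header (e.g. "nginx/1.14.0"), a common first clue that
# an unmaintained Apache/NGINX/IIS build is running. The URL used with
# server_banner() is a placeholder.
import re
import urllib.request

def server_banner(url: str, timeout: float = 5.0) -> str:
    """Return the Server header of an HTTP response ('' if absent)."""
    with urllib.request.urlopen(url, timeout=timeout) as resp:
        return resp.headers.get("Server", "")

def discloses_version(banner: str) -> bool:
    """True if the banner includes a version number, e.g. 'nginx/1.14.0'."""
    return bool(re.search(r"/\d+(\.\d+)+", banner))
```

Suppressing the version in the banner (via `server_tokens off;` in NGINX or `ServerTokens Prod` in Apache) doesn’t patch anything, but it denies attackers the easy fingerprint; the real fix remains keeping such servers updated and off public IPs.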
An increased reliance on cloud computing also raises risk. Contrary to popular belief, the onus to protect cloud data falls on the customer, not the service provider. Corporations make themselves vulnerable through cloud service providers in a multitude of ways: incomplete control over who can access sensitive data, cloud misconfiguration, cloud applications provisioned outside IT visibility, and a lack of staff with the skills to manage security for cloud applications, among others.
Could the spike in data breaches stem from irresponsible data practice?
Regardless of whether an organization sits in the public or private sector, the same rules apply: encrypt your data, keep your databases off public IPs, secure access via proxy, and follow best practice to ensure maximum security for your data. Keen to take things one step further? Deploy external attack surface management solutions to detect vulnerable and unknown assets before they’re exploited.