
Best Infosec-Related Long Reads for the Week of 3/16/24

Crypto startups are scrimping on security, Fake streaming music accounts pump up royalties, The problems with post-quantum cryptography, American barriers to TikTok-like foreign surveillance go back to 1787, Social resistance to app overuse clashes with civil liberties in TikTok ban, TikTok's Project Texas is dead

Metacurity is pleased to offer our free and premium subscribers a weekly digest of the best long-form (and longish) infosec-related pieces we couldn’t properly fit into our daily crush of news. So tell us what you think, and feel free to share your favorite long reads via email at [email protected].

Image created by Playground v2.5.

North Korean Hackers Stalk Crypto Startups Scrimping on Security

Bloomberg’s Anna Irrera and Olga Kharif examine how crypto startups strapped for cash after a prolonged funding drought have cut security spending, leaving them vulnerable to hackers even as crypto thefts, particularly from North Korea, soar.

Despite the high stakes, many firms find themselves having to make tough choices. While there isn’t any data tracking code-auditing spending by crypto firms, executives at outfits that provide such services say demand has cooled.

Even after the cost of a typical crypto audit dropped roughly 50% since 2022 to around $20,000 per week, according to several firms, “projects are still unable to afford that,” said Hind Kurhan, who in September founded security auditing firm Thesis Defense and aims to establish an industry standard for audits.

At crypto-auditing startup Halborn, Chief Executive Officer Robert Behnke said “inbound interest” dropped 60% last year. Rates for auditing a type of smart contract built on the Ethereum blockchain fell as much as 20%, he said. Diligence, the auditing arm of ConsenSys, has seen the waiting time for its security screenings shrink.

Some companies are forgoing labor-intensive manual code audits in favor of less-precise automated tools to scan for weaknesses, security experts say.

One Man’s Army of Streaming Bots Reveals a Whole Industry’s Problem

Wired’s Morgan Meaker uses the case of a Danish man sentenced to eighteen months in prison to delve into a little-covered arena of online fraud: fake or “artificial” music streams, in which music makers create multiple fake accounts that stream their own tracks and generate plays to boost their royalties.

Fake or “artificial” streams are a big problem for the streaming industry. Between 1 billion and 3 billion fake streams took place on popular music platforms in 2021, according to a study by France’s National Music Center. Fake streams are a problem, according to the music industry, because they divert royalty payments away from real artists and pollute streaming platforms’ data.

“This is an example of a problem that's becoming a liability within the music industry,” says Rasmus Rex Pedersen, an associate professor in communication at Roskilde University in Denmark, who researches music streaming. “The streaming services have had several years to develop tools to combat this type of fraud and apparently they haven't been doing a very good job.” There are still services advertising sales of fake streams, he adds.

In February, a court in the Danish city of Aarhus heard how the man, whose name was withheld, was accused of using bots to generate a suspiciously high number of plays on 689 tracks, which he had registered as his own music. In one week, 244 music tracks were listened to 5.5 million times, with 20 accounts responsible for the majority of the streams. The defendant had previously argued these playbacks were linked to his job in the music industry. He plans to appeal, his lawyer Henrik Garlik Jensen told WIRED.

The man created software that played the music automatically, claims Maria Fredenslund, CEO of the Danish Rights Alliance, which protects copyright on the internet and first reported the case to the police. “So he didn't really listen to the music. No one really listened to the music.” According to the Danish Rights Alliance, the defendant had 69 accounts with music streaming services, including 20 with Spotify alone. Due to his network of accounts, he was at one point the 46th highest-earning musician in Denmark.

While the defendant created much of the music himself, 37 tracks were altered versions of Danish folk music, where the tempo and pitch had been changed, adds Fredenslund, who attended court.

Post-quantum cryptography is too damn big

Censys cofounder and Google product manager David Adrian explains how the algorithms emerging from the NIST Post-Quantum Cryptography competition, which is sifting through candidates for the key exchange and signatures needed to protect the web when quantum computing arrives, are not yet ready to deploy across the public web, chiefly because the proposed signature schemes (and, to a lesser extent, the key encapsulation mechanisms, or KEMs) add far too many bytes to the TLS handshake.

The current breakdown of key and signature sizes in TLS is roughly:

Root certificates often contain RSA keys, as do intermediate certificates. Root certificates are predistributed, and intermediates are provided by the server, alongside the leaf certificate. An RSA intermediate certificate has a 4096-bit (512-byte) signature and a 2048-bit (256-byte) public key.

An ECDSA leaf certificate has a 32-byte key and a 256-byte RSA signature from the intermediate.

The handshake contains a 64-byte ECDSA signature.

Each SCT contains a 64-byte ECDSA signature.

In total, this is 512 + 256 + 256 + 32 + 64 + 2*64 = 1,248 bytes of signatures and public keys in a normal TLS handshake for HTTPS. Of the winning signature algorithms from the first NIST PQC competition, ML-DSA (Dilithium) is the only signature algorithm that could be used in the context of TLS and it has 1,312-byte public keys and 2,420-byte signatures. This means a single ML-DSA public key is bigger than all of the 5 signatures and 2 public keys currently transmitted during an HTTPS connection. In a direct “copy-and-replace” of current signature algorithms with ML-DSA, a TLS handshake would contain 5*2420 + 2*1312 = 14,724 bytes of signatures and public keys, an over 10x increase.
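
To make the arithmetic easy to check, here is a minimal Python sketch of the tally above; it is our back-of-the-envelope illustration, not Adrian’s code, and it takes the certificate-chain shape and ML-DSA sizes exactly as quoted in the excerpt:

```python
# Back-of-the-envelope tally of the authentication bytes in a TLS handshake
# today versus a naive "copy-and-replace" swap to ML-DSA (Dilithium),
# using only the sizes quoted above. All sizes are in bytes.

classical = {
    "root's RSA signature on the intermediate": 512,
    "intermediate RSA public key": 256,
    "intermediate's RSA signature on the leaf": 256,
    "leaf ECDSA public key": 32,
    "handshake ECDSA signature": 64,
    "two SCT ECDSA signatures": 2 * 64,
}
classical_total = sum(classical.values())

# ML-DSA sizes as quoted: 1,312-byte public keys, 2,420-byte signatures.
# A direct swap keeps the same shape: 5 signatures and 2 public keys.
ML_DSA_PUBLIC_KEY = 1312
ML_DSA_SIGNATURE = 2420
pq_total = 5 * ML_DSA_SIGNATURE + 2 * ML_DSA_PUBLIC_KEY

print(f"classical handshake: {classical_total:,} bytes")          # 1,248
print(f"ML-DSA swap:         {pq_total:,} bytes")                 # 14,724
print(f"increase:            {pq_total / classical_total:.1f}x")  # ~11.8x
```

Running it reproduces the figures in the excerpt: 1,248 bytes today, 14,724 bytes after the swap, an increase of roughly 11.8x.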

Barring a large-scale quantum computer staring us in the face, this is not a tenable amount of data to send simply to open a connection. As a baseline reality check, we should not be sending over 1% of a 3.5" floppy disk purely in signatures and public keys.

In more concrete terms, for the server-sent messages, Cloudflare found that every 1K of additional data added to the server response caused median HTTPS handshake latency to increase by around 1.5%. For the ClientHello, Chrome saw a 4% increase in TLS handshake latency when it deployed ML-KEM, which takes up approximately 1K of additional space in the ClientHello. This pushed the size of the ClientHello past the standard maximum transmission unit (MTU) of packets on the Internet, ~1,400 bytes, causing the ClientHello to be fragmented over two underlying transport-layer (TCP or UDP) packets.
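
For a rough sense of where that fragmentation kicks in, here is a small illustrative sketch (ours, not from the original post): it assumes a ~400-byte classical ClientHello, a 32-byte X25519 share, and the 1,184-byte ML-KEM-768 encapsulation key from FIPS 203, and shows how the hybrid key share pushes the message past a single ~1,400-byte packet:

```python
# Illustrative check of when a ClientHello spills over the ~1,400-byte MTU.
# The ML-KEM-768 encapsulation key size (1,184 bytes) is per FIPS 203; the
# 400-byte baseline ClientHello is an assumption for illustration only, since
# real ClientHello sizes vary by client, extensions, and session tickets.
import math

MTU_PAYLOAD = 1400          # typical usable bytes per packet, per the excerpt
BASELINE_CLIENTHELLO = 400  # assumed classical ClientHello size (varies)
X25519_SHARE = 32           # classical key share
ML_KEM_768_KEY = 1184       # encapsulation key added by the hybrid key share

classical = BASELINE_CLIENTHELLO + X25519_SHARE
hybrid = classical + ML_KEM_768_KEY

for label, size in (("classical", classical), ("hybrid ML-KEM", hybrid)):
    packets = math.ceil(size / MTU_PAYLOAD)
    print(f"{label:>13}: {size:5,} bytes -> {packets} packet(s)")
# The classical ClientHello fits in one packet; the hybrid one needs two.
```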

Critics of the TikTok Bill Are Missing the Point

In the Atlantic, Zephyr Teachout, professor of law at Fordham Law School, argues that the outrage over a potential TikTok ban is misplaced, given that erecting barriers to foreign government surveillance and political interference has been an essential feature of American self-government since the very founding of the country.

During the Constitutional Convention in 1787, the Framers were quite worried that foreign powers would exploit America’s open form of government to serve their own interests. At the time, the United States was small and weak compared with the powerhouses of France and England, and the Framers feared that favors and financing could seduce officeholders. Alexander Hamilton cautioned that “foreign powers also will not be idle spectators. They will interpose, the confusion will increase, and a dissolution of the Union ensue.” The Constitution therefore forbids foreigners from running for Congress until they have been U.S. citizens for seven years, and famously prohibits anyone but a natural-born citizen from being president. Elbridge Gerry, the great champion of the Bill of Rights, argued at the Constitutional Convention that “foreign powers will intermeddle in our affairs, and spare no expence to influence them. Persons having foreign attachments will be sent among us & insinuated into our councils, in order to be made instruments for their purposes. Every one knows the vast sums laid out in Europe for secret services.”

Even the treaty-ratification rule in the Constitution, which requires a two-thirds congressional vote, was included in order to reduce “the power of foreign nations to obstruct our retaliating measures on them by a corrupt influence,” as James Madison put it. And as we all learned during the Trump presidency, Article I of the U.S. Constitution forbids federal officials, without a special dispensation from Congress, from receiving gifts or emoluments from foreign governments. (I was a lawyer on the emoluments lawsuit against Trump, which had overcome preliminary legal challenges when he lost reelection.)

After the Constitution was ratified, Congress regularly used limits on foreign ownership and influence as a mechanism of preserving sovereignty, democracy, and national security. The limits are most pronounced in areas that affect politics, elections, and communications. Foreign nationals who are not green-card holders cannot contribute to political campaigns. Under the Foreign Agents Registration Act, lobbyists for foreign governments are far more strictly regulated than other lobbyists. The law, passed in the run-up to World War II, was strengthened after hearings in the 1960s revealed the degree to which foreign money was influencing domestic policy.

Other laws limit foreign control of different forms of infrastructure. The Defense Production Act authorizes the executive branch to block proposed or pending foreign corporate mergers that threaten national security. Vessels transporting cargo between two points in the United States must be U.S.-built and U.S.-owned. Certain defense contracts cannot be awarded to foreign-government-controlled companies unless specifically authorized by the secretary of defense. The Federal Energy Regulatory Commission can issue licenses for constructing dams or transmission lines only to U.S. entities, and geothermal lessees have foreign-ownership limits. As the Vanderbilt University law professor Ganesh Sitaraman has argued, the body of law limiting foreign ownership in various sectors can mostly be understood through the lens of platform regulation: They prevent foreign governments from taking over core elements of infrastructure.

This includes communications infrastructure. Limits on foreign ownership have been a part of federal communications policy for more than a century. The Radio Act of 1912 was the first federal limitation on ownership of communications infrastructure, forbidding foreign ownership of radio stations. It expanded and set a blueprint for later communications rules—Rupert Murdoch, for example, had to become an American citizen to avoid Federal Communications Commission rules banning foreign owners of American TV networks—which were based on the twin fears of espionage and propaganda. TikTok, of course, falls right at the intersection of those fears.

The Misguided Attempt to Control TikTok

The New Yorker’s Jay Caspian Kang analyzes the First Amendment implications of banning TikTok against the backdrop of growing public resistance to social media overuse, arguing that protecting free speech is worth it, even if we hate the scourge of constant screen time.

Although Congress’s TikTok legislation is based partly on fears about data collection, both that bill and the Snap case suggest that the budding resistance to social media will inevitably clash with civil liberties. Social media is now the public sphere. Yes, the major social-media apps are owned by private companies, but, when North Carolina tried to bar sex offenders from using Facebook and other social-media sites, the Supreme Court ruled, in Packingham v. North Carolina, that the government could not restrict people from using “what for many are the principal sources for knowing current events, checking ads for employment, speaking and listening in the modern public square, and otherwise exploring the vast realms of human thought and knowledge.” Last year, as state legislatures across the country were drafting bills to place age restrictions on social-media platforms, I wrote that these laws, though understandable in spirit, were simply too unconstitutional to consider, especially given their unrealistic and clunky enforcement mechanisms, which would have gone well beyond just keeping kids off the platforms.

Setting aside the data-collection issue, the effort to bar TikTok is somehow even more unconstitutional. Another Supreme Court case is relevant here. In the sixties, a Socialist philosopher named Corliss Lamont was waiting for the delivery of the Peking Review, an explicitly Communist publication from China. At the time, the United States Postmaster General complied with a rule that dictated that any piece of mail from a foreign country that had been flagged as “Communist political propaganda” would be intercepted and set aside. The addressee would be mailed a card notifying them that the foreign Communist propaganda was waiting for them and would be destroyed if they didn’t send back a query card within twenty days affirming that they had indeed ordered the propaganda and would still like it to be delivered.

Lamont sued the Postmaster General, arguing that the stoppage of his mail and the requirement to put himself on a list violated both his First and Fifth Amendment rights. A year later, in 1965, the Supreme Court ruled that American citizens had a “right to receive” information, even if it was foreign Communist propaganda. Writing in the Times last year, Jameel Jaffer, the executive director of the Knight First Amendment Institute at Columbia University, noted that the Lamont case, along with Packingham v. North Carolina, left “no question that government action whose effect would be to bar Americans from using a foreign communications platform would implicate the First Amendment.” If Americans have the constitutional right to receive explicit foreign propaganda through the mail without even having to deal with the inconvenience of filling out a reply card, presumably they also have the right to receive whatever propaganda gets smuggled in through TikTok’s endless reel of dancing teen-agers.

What Happened to TikTok’s Project Texas?

In Lawfare, Matt Perault, director of the Center on Technology Policy at the University of North Carolina, outlines how TikTok’s much-ballyhooed Project Texas, its plan to wall off American user data inside a U.S. subsidiary running on Oracle’s U.S. cloud infrastructure and insulate it from Chinese access and influence, is stillborn.

Since that briefing in January 2023, Project Texas has barely registered in the policy debate about TikTok’s future. When [TikTok CEO Shou Chew] testified in Congress in March 2023, the most significant discussion of Project Texas was a Texas representative’s criticism that TikTok used his home state in the plan’s name. In an attempt to signal that its promises are not empty, TikTok has implemented many of Project Texas’s features, including transferring U.S. user data to the cloud infrastructure of Oracle, a U.S. company. But the few lawmakers who have raised Project Texas have dismissed it out of hand, while typically neglecting to mention any specific feature that is problematic or identifying possible remedies to address deficiencies.

Last week, when the House passed the Protecting Americans From Foreign Adversary Controlled Applications Act, a bill that bans TikTok unless it is sold so that it is “no longer being controlled by a foreign adversary,” it was clear that Project Texas has not had the impact on the policy debate that TikTok hoped it would. In retrospect, Project Texas was stillborn.

Why?

One view is that Project Texas did not sufficiently mitigate the risks posed by TikTok. Project Texas focused on addressing three main risks: first, that U.S. user data could be accessed by the Chinese government; second, that the Chinese government could influence content distribution on the platform; and third, that there would be insufficient transparency to understand and identify future risks as they arose.

To address risks of data access, Project Texas created a U.S. subsidiary called U.S. Data Security (USDS) to manage U.S. user data. USDS would store data on Oracle Cloud infrastructure, and the new subsidiary would be staffed by U.S.-based employees. Data stored in USDS could flow out of the United States in a limited set of circumstances, such as when a U.S. user messaged someone based outside the United States or posted a video globally.

To address risks that the Chinese government could influence content, TikTok would house the key content moderation functions within USDS, including the Trust and Safety and User Operations teams. Oracle would inspect source code within USDS, and TikTok’s recommendations algorithm would be subject to review by third-party auditors.

To address concerns that Project Texas would be insufficiently transparent, the plan included oversight and auditing features to help surface potential risks. In the briefing, we were told that seven entities would conduct oversight of various components of Project Texas, including the Committee on Foreign Investment in the United States (CFIUS), which is responsible for investigating national security risks of foreign investment in the United States; Oracle, the trusted technology provider; a source code inspector nominated by Oracle and approved by CFIUS to conduct an independent inspection of the source code; and a data deletion auditor to verify that all U.S. person data held on TikTok servers prior to the creation of USDS had been successfully deleted. In addition, CFIUS would have broad authority to review the employees of USDS and to approve auditors.

Lawmakers quickly dismissed the idea that these features could meaningfully combat potential national security risks. Rep. Cathy McMorris Rodgers (R-Wash.), chair of the House Energy and Commerce Committee, dismissed Project Texas as a “marketing scheme.” Rep. Frank Pallone (D-N.J.) called it “simply not acceptable.” Rep. Jay Obernolte (R-Calif.) said that it would not be “technically possible” for TikTok to do what it said it would do.
