
Jimmy Su

Darknet bad actors work together to steal your crypto, here’s how — Binance CSO

Crypto hackers have turned their attention toward the crypto user, and “security hygiene” is more important than ever, according to Binance’s Jimmy Su.

Lurking in the shadiest corners of the dark web is a “well-established” ecosystem of hackers that target cryptocurrency users with poor “security hygiene,” according to Binance’s chief security officer, Jimmy Su.

Speaking to Cointelegraph, Su said that hackers had shifted their gaze toward crypto end-users in recent years.

Su noted that when Binance first launched in July 2017, the team saw plenty of hacking attempts on its internal network. That focus has since shifted toward users as crypto exchanges have continued to beef up their security.

“Hackers always choose the lowest bar to achieve their goals because, for them, it’s a business as well. The hacker community is a well-established ecosystem.”

According to Su, this ecosystem comprises four distinct layers: intelligence gatherers, data refiners, hackers and money launderers.

Data gatherers

The most upstream layer is what Su described as “threat intelligence.” Here, bad actors collect and collate ill-gotten intel about crypto users, creating entire spreadsheets filled with details about different users.

This could include the crypto websites a user frequents, the email addresses they use, their name, and whether they are on Telegram or social media.

“There is a market for this on the dark web where this information is sold [...] that describes the user,” explained Su in a May interview.

Su noted this information is usually gathered in bulk, for example from earlier leaks of customer information or from hacks targeting other vendors or platforms.

In April, a research paper by Privacy Affairs revealed cybercriminals have been selling hacked crypto accounts for as little as $30 a pop. Forged documentation, often used by hackers to open accounts on crypto trading sites, can also be bought on the dark web.

Data refiners

According to Su, the data gathered is then sold downstream to another group — usually made up of data engineers specializing in refining data.

“For example, there was a data set last year for Twitter users. [...] Based on the information there, they can further refine it, based on the tweets, to see which ones are actually crypto-related.”

These data engineers will then use “scripts and bots” to figure out which exchanges the crypto enthusiast may be registered with.

They do this by attempting to create an account with the user’s email address. If they get an error saying the address is already in use, they know the user is registered with that exchange, which is valuable information for more targeted scams, said Su.
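The check described above only works if an exchange’s sign-up flow answers differently for registered and unregistered email addresses. As a rough, hypothetical sketch of the standard countermeasure, the registration handler below (the endpoint name, data store and response text are all invented for illustration) returns an identical response in both cases, so a bot probing with harvested addresses learns nothing:

```python
from flask import Flask, jsonify, request

app = Flask(__name__)

# Hypothetical store of existing accounts; a real exchange would query its user database.
EXISTING_EMAILS = {"tommy@example.com"}

@app.route("/register", methods=["POST"])
def register():
    email = request.get_json(force=True).get("email", "")
    if email not in EXISTING_EMAILS:
        pass  # create the account and send a confirmation email (omitted)
    # The response is identical whether or not the address was already in use,
    # so a bot probing harvested addresses cannot tell the two cases apart.
    return jsonify({
        "status": "ok",
        "message": "If this address is valid, a confirmation email has been sent.",
    }), 200

if __name__ == "__main__":
    app.run()
```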

Hackers and phishers

The third layer is usually what creates headlines. Phishing scammers or hackers will take the previously refined data to create “targeted” phishing attacks.

“Because now they know ‘Tommy’ is a user of exchange ‘X,’ they can just send an SMS saying, ‘Hey Tommy, we detected someone withdrew $5,000 from your account; please click this link and reach customer service if it wasn’t you.’”

In March, hardware wallet provider Trezor warned its users about a phishing attack designed to steal investors’ money by making them enter the wallet’s recovery phrase on a fake Trezor website.

The phishing campaign involved attackers posing as Trezor and contacting victims via phone calls, texts or emails, claiming there had been a security breach or suspicious activity on the victim’s Trezor account.

A screenshot from a phishing domain copying Trezor’s website. Source: Bleeping Computer

Getting away with it

Once the funds are stolen, the final step is getting away with the heist. Su explained this could involve leaving the funds dormant for years and then moving them to a crypto mixer such as Tornado Cash.

Related: Arbitrum-based Jimbos Protocol hacked, losing $7.5M in Ether

“There are groups that we know that may sit on their stolen gains for two, three years without any movement,” added Su.

While not much can be done to stop crypto hackers, Su urged crypto users to practice better “security hygiene.”

This could involve revoking permissions for decentralized finance projects if they no longer use them, or ensuring communication channels, such as email or SMS used for two-factor authentication, are kept private.
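On the first of those points, revoking a DeFi project’s permissions on an Ethereum-style chain usually means setting the token allowance once granted to it back to zero. The following sketch uses web3.py (v6-style method names) with placeholder addresses and RPC URL to show how such a revocation can be checked and prepared; it is a generic illustration, not a Binance-specific tool, and the final transaction would still need to be signed and broadcast from the user’s own wallet:

```python
from web3 import Web3

# Placeholder values for illustration only.
RPC_URL = "https://mainnet.example-rpc.invalid"
TOKEN = Web3.to_checksum_address("0x0000000000000000000000000000000000000001")    # ERC-20 token
SPENDER = Web3.to_checksum_address("0x0000000000000000000000000000000000000002")  # DeFi contract no longer used
OWNER = Web3.to_checksum_address("0x0000000000000000000000000000000000000003")    # the user's own address

# Minimal ABI: just the two ERC-20 functions this sketch needs.
ERC20_ABI = [
    {"name": "allowance", "type": "function", "stateMutability": "view",
     "inputs": [{"name": "owner", "type": "address"},
                {"name": "spender", "type": "address"}],
     "outputs": [{"name": "", "type": "uint256"}]},
    {"name": "approve", "type": "function", "stateMutability": "nonpayable",
     "inputs": [{"name": "spender", "type": "address"},
                {"name": "amount", "type": "uint256"}],
     "outputs": [{"name": "", "type": "bool"}]},
]

w3 = Web3(Web3.HTTPProvider(RPC_URL))
token = w3.eth.contract(address=TOKEN, abi=ERC20_ABI)

# 1. Check how much the old DeFi contract is still allowed to spend.
remaining = token.functions.allowance(OWNER, SPENDER).call()
print(f"Outstanding allowance: {remaining}")

# 2. Prepare the transaction that sets the allowance back to zero.
#    It still has to be signed and broadcast from the owner's wallet.
if remaining > 0:
    revoke_tx = token.functions.approve(SPENDER, 0).build_transaction({
        "from": OWNER,
        "nonce": w3.eth.get_transaction_count(OWNER),
    })
    print(revoke_tx)
```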

Magazine: Tornado Cash 2.0 — The race to build safe and legal coin mixers


AI deepfakes are getting better at spoofing KYC verification: Binance exec

The technology is getting so advanced, deepfakes may soon become undetectable by a human verifier, said Jimmy Su, Binance's Chief Security Officer.

Deepfake technology used by crypto fraudsters to bypass know-your-customer (KYC) verification on crypto exchanges such as Binance is only going to get more advanced, Binance's chief security officer warns.

Deepfakes are made using artificial intelligence tools that use machine learning to create convincing audio, images or videos featuring a person’s likeness. While there are legitimate use cases for the technology, it can also be used for scams and hoaxes.

Speaking to Cointelegraph, Binance chief security officer Jimmy Su said there has been a rise in fraudsters using the tech to try and get past the exchange’s customer verification processes.

“The hacker will look for a normal picture of the victim online somewhere. Based on that, using deepfake tools, they’re able to produce videos to do the bypass.”

Su said the tools have become so advanced that they can even correctly respond to audio instructions designed to check whether the applicant is a human and can do so in real-time.

“Some of the verification requires the user, for example, to blink their left eye or look to the left or to the right, look up or look down. The deepfakes are advanced enough today that they can actually execute those commands,” he explained.

However, Su believes the faked videos are not at the level yet where they can fool a human operator.

“When we look at those videos, there are certain parts of it we can detect with the human eye,” said Su, for example, when the user is required to turn their head to the side.

“AI will overcome [them] over time. So it's not something that we can always rely on.”

In August 2022, Binance’s chief communications officer Patrick Hillmann warned that a “sophisticated hacking team” was using his previous news interviews and TV appearances to create a “deepfake” version of him.

The deepfake version of Hillmann was then deployed to conduct Zoom meetings with various crypto project teams promising an opportunity to list their assets on Binance — for a price, of course.

“That's a very difficult problem to solve,” said Su, when asked about how to combat such attacks.

“Even if we can control our own videos, there are videos out there that are not owned by us. So one thing, again, is user education.”

Related: Binance off the hook from $8M Tinder ‘pig butchering’ lawsuit

Binance is planning to release a blog post series aimed at educating users about risk management.

In an early version of the blog post featuring a section on cybersecurity, Binance said that it uses AI and machine learning algorithms for its own purposes, including detecting unusual login and transaction patterns and other "abnormal activity on the platform."
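Binance has not published how those models work, but “unusual login pattern” detection of this kind is commonly built on unsupervised anomaly detection over simple session features. The sketch below is a generic illustration using scikit-learn’s IsolationForest on made-up features (hour of login, distance from the user’s usual location, new-device flag); it is not a description of Binance’s system:

```python
import numpy as np
from sklearn.ensemble import IsolationForest

# Toy feature rows: [hour_of_day, km_from_usual_location, is_new_device]
normal_logins = np.array([
    [9, 2, 0], [10, 1, 0], [21, 5, 0], [8, 0, 0], [22, 3, 0],
    [9, 4, 0], [20, 2, 0], [11, 1, 0], [23, 6, 0], [10, 0, 0],
])

# Fit on the user's ordinary login history.
model = IsolationForest(contamination=0.1, random_state=0)
model.fit(normal_logins)

# A 3 a.m. login from 8,000 km away on a new device should score as an outlier (-1).
print(model.predict(np.array([[3, 8000, 1]])))  # expected: [-1] -> trigger step-up verification
print(model.predict(np.array([[10, 1, 0]])))    # expected: [1]  -> looks like a normal session
```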

AI Eye: ‘Biggest ever’ leap in AI, cool new tools, AIs are the real DAOs
