
Salvaging a Malware-infected Website, Part 1

March 25th, 2026

(Heavy sigh.)

I'm afraid this post reflects very badly on me as a system administrator. Server setup and maintenance is a substantial part of what I do at work these days, so I feel some trepidation admitting to what I'm about to admit to. But hell, let's write it anyway.

Set it and forget it 😚

Back in the second Obama administration I built a website for a small local business as a side project. The client was a mom-and-pop equipment rental place owned by a friend of a family member. The website was nothing fancy — it just needed to look good, be fast and show their product catalog and contact info. It also had a basic CMS where the owners could log in and add, update or remove products. I wrote the thing in Laravel and deployed it to a cheap VPS on DigitalOcean. They were happy with the result, I got paid a satisfactory amount and all was well.

I then proceeded to almost completely ignore this website for a decade. In my defense, your honor, the damn thing kept running just fine! The client never reported any problems or asked for changes, apart from a few tweaks to the contact info early on. Meanwhile I was busy making my 30's happen: working, building a house, starting a family, more work. It did occasionally cross my mind that I should probably check on that website and do some maintenance, but that concern never rose to the top of my to-do list — even though I really knew better at this point.

The situation suddenly changed a few weeks ago.

The Discovery 👨🏻‍💻

I've been doing some reviewing of my online credentials lately — going through 1Password to check for breaches, update credentials or throw them away if no longer needed. One of these was the SSH key for that DigitalOcean droplet, so I figured I might as well check if it still worked.

% ssh root@website
root@website's password: ■

Oops, turns out I didn't load that key into my SSH agent properly … but that should just give me an error, not a password prompt? Nowadays the first step I take when configuring a web server is to disable password login, but apparently not in 2015. Luckily I still had that password saved too, so let's continue.

Welcome to Ubuntu 14.04.5 LTS (GNU/Linux 3.13.0-63-generic x86_64)

 * Documentation:  https://help.ubuntu.com/

  System information disabled due to load higher than 1.0

New release '16.04.7 LTS' available.
Run 'do-release-upgrade' to upgrade to it.

Oof, OK. The server is still running Ubuntu 14.04, which stopped receiving security updates in 2019. I should definitely make some time to upgrade it, and disable password logins while I'm at it.
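Speaking of which: disabling password logins is only a few lines in /etc/ssh/sshd_config. A minimal sketch of what I apply on new servers these days (these are standard OpenSSH options; make sure key-based login actually works first, or you'll lock yourself out):

```shell
# /etc/ssh/sshd_config — hardening fragment (verify key login works first!)
PasswordAuthentication no
ChallengeResponseAuthentication no
PermitRootLogin prohibit-password
```

After merging these in, run `sshd -t` to check the config for syntax errors, then reload the ssh service to apply it. It's wise to test from a second terminal before closing the current session.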

That System information disabled due to load higher than 1.0 catches me off guard, though. I'm pretty sure this website rarely gets enough traffic to push CPU usage beyond single digits. So what's eating the resources? This is when I begin to feel a little uneasy.

root@website:~# top
top - 21:53:14 up 706 days, 1 min,  2 users,  load average: 1.05, 1.07, 1.05
Tasks:  86 total,   2 running,  84 sleeping,   0 stopped,   0 zombie
%Cpu(s): 99.7 us,  0.3 sy,  0.0 ni,  0.0 id,  0.0 wa,  0.0 hi,  0.0 si,  0.0 st
KiB Mem:   1017716 total,   910564 used,   107152 free,    94756 buffers
KiB Swap:  1048572 total,    78880 used,   969692 free.   360704 cached Mem

  PID USER      PR  NI    VIRT    RES    SHR S  %CPU %MEM     TIME+ COMMAND
18040 deploy    20   0  307452 272520      4 S  97.4 26.8   1870:42 kauditd0
 1091 deploy    20   0  338432   2604   2500 S   0.3  0.3  37:59.43 php5-fpm
    1 root      20   0   33512   2320   1172 S   0.0  0.2  36:26.54 init
    2 root      20   0       0      0      0 S   0.0  0.0   0:00.01 kthreadd
    3 root      20   0       0      0      0 S   0.0  0.0 100:52.40 ksoftirqd/0
    5 root       0 -20       0      0      0 S   0.0  0.0   0:00.00 kworker/0:0H
    7 root      20   0       0      0      0 S   0.0  0.0 660:55.72 rcu_sched
    8 root      20   0       0      0      0 R   0.0  0.0 719:24.81 rcuos/0
    9 root      20   0       0      0      0 S   0.0  0.0   0:00.00 rcu_bh
   10 root      20   0       0      0      0 S   0.0  0.0   0:00.00 rcuob/0

Something is running on this server that uses nearly the whole CPU and a good chunk of RAM. A look at the charts on DigitalOcean confirms that's been the case continuously for at least two weeks. kauditd0 sounds like a Linux kernel thing, but it's running under my deploy user — which is only meant to run the website itself (that's the php5-fpm there, chugging along happily with its 0.3% sliver of CPU time).

Maybe it is an OS thing anyway that got stuck and is spinning its wheels for some reason? I reboot the server but kauditd0 comes right back, gnawing down on those processor cycles. Time to investigate.

(Cracks knuckles.)

Exposing the Crime 🔍

First let's find out where our CPU-eater lives and how it keeps coming back.

root@website:/# ls -l /proc/18040/exe
lrwxrwxrwx 1 deploy deploy 0 /proc/18040/exe -> /home/deploy/.configrc7/a/kswapd00

kauditd0's process ID leads to an executable file in a directory under my deploy user's home folder. It doesn't require a Torvaldian level of Linux knowledge to find out how it survives a reboot: there are a bunch of entries in deploy's cron config pointing to the same directory.

deploy@website:~$ crontab -l
*/30 * * * * /tmp/.kswapd00 || /home/deploy/.configrc7/a/kswapd00 > /dev/null 2>&1
5 6 */2 * 0 /home/deploy/.configrc7/a/upd>/dev/null 2>&1
@reboot /home/deploy/.configrc7/a/upd>/dev/null 2>&1
5 8 * * 0 /home/deploy/.configrc7/b/sync>/dev/null 2>&1
@reboot /home/deploy/.configrc7/b/sync>/dev/null 2>&1
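Incidentally, a quick way to hunt for this kind of persistence is to dump every user's crontab in one pass. A sketch (run as root; looping over /etc/passwd is just my habit, not the only approach):

```shell
# Print each local user's crontab, prefixed with the username.
# Needs root, since crontab -u reads other users' tables.
for u in $(cut -d: -f1 /etc/passwd); do
  crontab -l -u "$u" 2>/dev/null | sed "s/^/$u: /"
done
# Other classic hiding spots: /etc/crontab, /etc/cron.d/, rc.local, systemd units
```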

That directory contains several binaries and one readable shell script. The script base64-decodes a very long string of gibberish, writes the result to a file and runs it through … Perl? There's a language I hadn't seen in a while. I grab the string from the script, decode it and take a look.
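Decoding a payload like that for inspection is a one-liner, as long as you only read the result and never execute it. A sketch with a harmless stand-in string (just the base64 of a Perl shebang, not the actual dropper):

```shell
# Decode an embedded payload for inspection only; never pipe it into perl or sh.
# PAYLOAD is a harmless stand-in: the base64 encoding of "#!/usr/bin/perl".
PAYLOAD='IyEvdXNyL2Jpbi9wZXJs'
echo "$PAYLOAD" | base64 -d > /tmp/decoded.pl   # write it out...
head -n 1 /tmp/decoded.pl                       # → #!/usr/bin/perl
```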

The Sudden Brazilian 🇧🇷

The script is over 900 lines and a lot of it is in Portuguese — Brazilian Portuguese, to be precise.

#!/usr/bin/perl
########## CONFIGURACAO ############
my $processo = 'edac0';
$servidor='0xB32B8B53' unless $servidor;
my $porta='443';
my @canais=("#001");
my @adms=("molly","polly");
my @auth=("localhost");
# Anti Flood ( 6/3 Recomendado )
my $linas_max=5;
my $sleep=5;
...

It's an IRC bot that enables remote control of the system: the attacker can run arbitrary commands, download files, whatever. The only limit on its power is that it runs as the deploy user, which has limited control and can only access certain directories — but that doesn't prevent the script from downloading and running anything it wants to (almost) fully exploit the CPU.

I won't go over the whole implementation in detail, but here's the gist of how it works: the bot connects to the attacker's IRC server, joins a channel and waits for its admins to issue commands, which it then executes as the deploy user.

Apparently this kind of IRC bot malware has been around since the mid-2000s in various permutations, and it has its roots in the Brazilian and Romanian hacking scenes. Cool!

What's not cool is that somebody (or some automated thing) has clearly gained illicit control of my server — probably months or years ago — and is relentlessly exploiting its resources. I have a suspicion what for, but let's check to make sure.

Big Cash Money 💰

Inspecting the network traffic reveals a steady stream of activity, but not a lot of meaningful info at first glance. But if the CPU-eating process is what I suspect, it will probably start off with some kind of handshake or login call when it first starts up. Let's use tcpdump to look for signs of that, and then kill the kauditd0 process so we can see what happens as it automatically starts up again:

deploy@website:~# tcpdump -i any -n | grep -E "handshake|submit|login|user"
.4...y..{"id":1,"jsonrpc":"2.0","method":"login","params":{"login":"483fmPjXwX75xmkaJ3dm4vVGWZLHn3GDuKycHypVLr9SgiT6oaZgVh26iZRpwKEkTZCAmUS8tykuwUorM3zGtWxPBFqwuxS","pass":"x","agent":"XMRig/6.22.1 (Linux x86_64) libuv/1.44.2 gcc/10.3.1","algo":["cn/1","cn/2","cn/r","cn/fast","cn/half","cn/xao","cn/rto","cn/rwz","cn/zls","cn/double","cn/ccx","rx/0","rx/wow","rx/arq","rx/graft","rx/sfx","rx/yada","argon2/chukwa","argon2/chukwav2","argon2/ninja","ghostrider"]}}

Aha! This is followed by a series of these, on a steady cadence:

{"jsonrpc":"2.0","method":"job","params":{"blob":"1010cede9ccc06c996bf50c8e31177cb3f949c792161f328cef56192fb7a7256fae4cf12eb019a000000a7661a3c55548ce471a643279ef1c53f68dad4e58e0c5158c680abb065c0b1bcb58f01","job_id":"33488872","target":"1b430000","algo":"rx/0","height":3604814,"seed_hash":"3ea19a5eb0995910bcef4248296a968638d4a8e13ec8494cd597b2f36362a244"}}

What we're looking at is a login to a Monero mining pool, with the attacker's wallet address as the username, followed by a steady stream of work assignments from the pool to compute hashes. I don't completely understand how this works on a technical level, and would honestly prefer to keep it that way (I'm still psychically recovering from the great NFT craze of 2022). But I know enough to know what's going on now: my server's CPU is being used to convert these hashes into fake Internet money.
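For the record, the interesting fields can be fished out of a captured login line with plain grep and cut, no JSON tooling required. A sketch, using a short made-up wallet address in place of the real one:

```shell
# Extract the wallet ("login") and miner agent from an XMRig-style login message.
# LOGIN uses a fake wallet address; the JSON structure mirrors the real capture.
LOGIN='{"id":1,"jsonrpc":"2.0","method":"login","params":{"login":"483fmEXAMPLEWALLET","pass":"x","agent":"XMRig/6.22.1 (Linux x86_64)"}}'
wallet=$(echo "$LOGIN" | grep -o '"login":"[^"]*"' | cut -d'"' -f4)
agent=$(echo "$LOGIN" | grep -o '"agent":"[^"]*"' | cut -d'"' -f4)
echo "$wallet ($agent)"   # → 483fmEXAMPLEWALLET (XMRig/6.22.1 (Linux x86_64))
```

Note that `"method":"login"` doesn't trip up the first grep: the pattern requires a colon directly after `"login"`, which only the params field has.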

By the way, this is an astonishingly ineffective way to mine crypto. Serious mining happens on big, expensive GPUs built for massively parallel number crunching — the kind of computational monsters that power AI workloads these days. These guys are stealing cycles from a single virtual CPU on a $10/month VPS that was allocated a small fraction of a real, physical server. Considering the algorithms listed in the login call and the amount of compute power on offer, my server is probably making them about $0.02 a day.

Then again — I don't know who or what is behind this, and how big their operation is. They could have thousands of poorly-protected servers like mine under their control, some of them much more powerful than mine — which could add up to serious money, I guess. I do have the wallet's address from the login call, but Monero is a "privacy coin", which means I can't access its total balance or find out who owns it. Sad.

In this case it's kind of a victimless crime — the software is clever enough not to hog 100% of the resources, and modest applications like my website work just fine on a handful of leftover processing power. The actual victim, I guess, is DigitalOcean, who has been footing the bill to power all these extra CPU cycles in one of their data centers just to generate some more heat and 2 cents a day for a criminal enterprise.

I do kind of feel bad about that, even if it doesn't amount to anything in the grand scheme of things. Sorry, Mrs. and Mrs. DigitalOcean 🥺

Now It's a Rescue Mission 🙎🏻‍♂️

Well that was a fun ride, but now it's time to put a stop to this — and to make amends for a decade of shameful neglect. Tune in next time for part two, when we rescue our website from the hands of those sinister Brazilian crypto miners and take measures so this kind of thing won't happen again.