
How Our Website Was Attacked by a Former Contractor and What We Learned as a Result

Early Friday morning, I was awoken by the sound of an Intercom notification from our website. A user on another continent had reported that a project he had been working on for the past few days had vanished from his account. I immediately checked the database, but everything seemed normal on the surface: the site was functioning and the database was accessible, yet I couldn't find any trace of the user or his project.

As a non-developer, I knew my technical understanding was limited, so we had to wait for Erik, our developer, to come online. Erik, an early riser, quickly logged into Google Firebase as usual, but something was clearly off when he spoke a few seconds later.

"I don't understand," he said. "All users are gone. Everything - their projects, credits, history - it's all gone."

As it turned out, almost all of the information in our database had been deleted between 1 and 4 in the morning. The site itself was still functioning, but the user data was gone.

"Did one of our developers do this by accident?" I wondered if perhaps a coding error had caused the issue. It never crossed my mind that we were under attack.

"No," Erik said firmly. "This was done by someone from the outside. It looks like an attack."

I was stunned. For years, I had been confident that our data was completely safe, thanks to the advanced databases we used on Google Cloud and the secure hosting provided by Amazon EBS. But somehow, something had gone terribly wrong.

"It looks like someone used the API key to run big queries," Erik explained. "There were more than 20 million data deletion requests in a few hours."

"How is this possible?" I asked, trying to wrap my head around what had happened.

"If someone has an API key, they can use the command line to run big queries," Erik said. "It looks like it was done by one of our former developers who had an API key saved on their computer."

I still didn't fully understand the situation, but Erik's normally calm tone was laced with concern, which was enough to tell me that this was a serious issue.
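For anyone wondering, as I was, how a single leaked key can cause this much damage: a service-account key lets the Firebase Admin SDK bypass security rules entirely, so bulk deletion takes only a few lines of code. Here is a minimal sketch, assuming Firestore; the file name and function are illustrative, not anything from our codebase or the attacker's:

```typescript
// Minimal sketch, assuming Firestore. A leaked service-account key
// gives the Admin SDK full read/write access; security rules do not
// apply to it. The file name and function here are illustrative.
import * as admin from "firebase-admin";

admin.initializeApp({
  // Whoever holds this JSON key file effectively owns the database.
  credential: admin.credential.cert("service-account-key.json"),
});

const db = admin.firestore();

// Deleting a collection takes only a few lines. Firestore batches are
// capped at 500 writes, so a real run would loop until nothing is left.
async function wipeCollection(name: string): Promise<void> {
  const snapshot = await db.collection(name).limit(500).get();
  const batch = db.batch();
  snapshot.docs.forEach((doc) => batch.delete(doc.ref));
  await batch.commit();
}
```

Twenty million deletions over a few hours is entirely consistent with a script like this running unattended.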


As a startup, how much time do you spend on security in your first years? From my experience, all effort and time in the early years go into creating a product that users will actually want. That means a lot of experiments, a lot of discoveries, and even more disappointments. It is an attempt to advance the project with minimal resources, constantly competing against rivals who unabashedly copy our findings and use gray schemes in marketing and SEO, all while balancing on the brink of financial survival.

Security? First, we need to build something that works like clockwork and brings in profit. We'll think about security later, when we become profitable and one of the market leaders notices us and wants to either acquire us or kill us. That's when we'll have to fear attacks. For now, we need to focus on growth and trust services like Amazon and Google Cloud to protect us.

But even if we had wanted to focus on security, it didn't look urgent. Experienced security professionals had advised us to pay attention to possible vulnerabilities, like protecting our site from malicious code, which can find its way in through a hundred different routes. But when we asked about the urgency, they told us it would be good to address these issues when we had time; for now, what we had should be enough.

“In general, you need to think about security all the time,” a friend who has worked in security for 20 years told me.

But as a startup, it's not easy to pull our developers away from the product and switch them to security tasks for a week.

From time to time we would receive emails from bounty hunters reporting vulnerabilities they had found and asking for a reward. These vulnerabilities were usually minor, and it seemed they were simply running some kind of automated bug-finding tool and mass-mailing the results in the hope that someone would pay.

In short, after two years of work we had settled on our business priorities, and security was not among the main ones.

Pull developers away from the product and switch them to security tasks for a week? No. We cannot afford it.


Messages from users continued to pour in. Our users in North America were waking up. We quickly established the main issue: the user data, or rather their projects, had not been copied; they had simply been deleted. Financial information and personal data, however, were safe, stored separately in Stripe with reliable protection. That was a relief.

"By the way, can we be attacked again?" I asked Erik.

"I changed the settings. Now they can't," he reassured me.

Another developer chimed in, "We urgently need to change API keys."

The panic began to subside as the team worked to restore the data. I focused on communicating with our users, informing them of the unforeseen circumstances and the possibility of system failures in the coming days. It was a stressful task, having to report that their projects were gone and we couldn't be sure if we could restore them. But I've learned that honesty is the best policy in times of crisis, along with staying calm, though that was easier said than done.

"I need to go outside for 10 minutes, get some fresh air," I told Erik.

"Of course," he replied, understanding.

I took a deep breath and exhaled, letting the oxygen clear my mind. The panic gave way to rational thinking. How could we recover the data? How could we protect the site from future attacks? Who among our friends might be able to help? And finally, who had attacked us, and why?


One thing the attack demonstrated was that our data could not be stolen; it could only be erased. So what was the motive? Revenge? Competition?

We quickly arrived at the most likely explanation: the attack could have been orchestrated by a contractor we had recently hired through Upwork. We had a conflict with this individual over the poor quality of his work, and after a dispute on the platform he was forced to refund part of the fee we had paid. He evidently disagreed with the moderator's decision: he immediately blocked all channels of communication and stopped responding to my messages.

Could it be this contractor? But why would he waste time organizing an attack that would bring him no dividends? Revenge seemed to be the answer. The attack took place two weeks after our conflict, suggesting he was still angry. The timing also supported this theory: the attack was launched at night our time, which would have been the middle of the day for the contractor in Vietnam.

However, we couldn't definitively prove that this individual was behind the attack. And in the end, it didn't matter. Whoever it was had been able to run code against our database, and that was all that mattered.


We quickly established that the data had been deleted only from Google Firebase; none of our other databases were affected. I spent most of the day trying to reach all our contacts at Google, though I wasn't particularly hopeful. It was a Friday, and while friends responded quickly, there was little they could do to help. Firebase support promised to get back to us by the end of the day, but it seemed unlikely they would have time to solve the problem.

As it turned out, support did respond by the end of the day, but only with a series of questions about what had happened and what kind of help we needed. It was clear we wouldn't get real assistance until Monday at the earliest. We were offered the option of subscribing to the more expensive Firebase support plan, at $500 per month, which would get us a faster response over the weekend. But there was no guarantee the data would be recovered, so we decided not to rush. Instead we spent the time assessing the damage, searching for any malicious code, and consulting experts who knew the subject.

After that, all we could do was sit still and wait for updates.


On the other hand, we had to assess the business risks. What if we couldn't restore the user projects? Some of them had worked with us for three years and had a lengthy history. Losing data could be quite painful for them, though it probably wasn't critical. Projects become obsolete, and what a user did a year ago is completely irrelevant today. User responses confirmed this assumption. Most were surprised that the projects were lost, but they said the loss didn't impact their work in any way.

Consultations with security specialists helped us understand where we stood and how we should proceed. It wouldn't be useful to recount their specific recommendations, as they were tailored to our business. However, here are some general recommendations that could be helpful to anyone in a similar situation:

  • Immediately rotate all keys and credentials and lock down all databases after an attack
  • Isolate and quarantine the part of the database that was attacked
  • Review all code written by the contractor under suspicion and make sure it contains no malicious code
  • Contact Google Cloud and Amazon AWS for advice on security settings
  • Set up backups, as Google Cloud doesn't make them by default (a sketch of one approach follows this list)
  • Harden your overall security setup as much as possible
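On the backups point in particular, here is a minimal sketch of one way to do it, assuming Firestore. Firestore doesn't back itself up, but its Admin API can export all collections to a Cloud Storage bucket; the call below could be wrapped in a scheduled Cloud Function to run daily. The bucket name is an illustrative placeholder, not our actual setup:

```typescript
// Hedged sketch: scheduled Firestore exports as a backup substitute.
// "my-backup-bucket" is an illustrative placeholder.
import { v1 } from "@google-cloud/firestore";

const client = new v1.FirestoreAdminClient();

async function exportAllCollections(projectId: string): Promise<void> {
  const databaseName = client.databasePath(projectId, "(default)");
  await client.exportDocuments({
    name: databaseName,
    // Each run writes to a dated folder inside the bucket.
    outputUriPrefix: `gs://my-backup-bucket/${new Date().toISOString().slice(0, 10)}`,
    collectionIds: [], // an empty array exports every collection
  });
}
```

An export is not a true point-in-time backup, but had we been running something like this daily, we would not have spent a week waiting on Google.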

The week passed by in the blink of an eye. Erik worked on setting up security. Google told us they would try to recover the data, but it was a delicate process and they couldn't give any guarantees. So we just had to focus on improving our security system and wait for a response from Google.

I went through all the notes on security settings that I had compiled over the years, and it felt like they had been ‘written in the blood of developers’. Before, when I heard about cyber security and penetration tests, it all seemed a bit far-fetched. Now these concepts read like centuries-old wisdom.

It was interesting to think back to a conversation I'd had a few months earlier with Hiten Shah, who created software designed to protect an organization's accounts from intruders among former employees. He had asked for opinions about the product while interviewing potential customers. At the time, I thought the software was only necessary for large companies with many developers and high staff turnover. But life taught me a lesson.


A week later, Google came back with great news - they had restored all user projects. It was a fantastic result, and to be honest, we didn't expect it. The users hardly reacted to the news, probably because the projects they were currently working on were more important to them. But for us, it was a victory - we had recovered all the user data.

During this time, we had strengthened the site's protection, and it is probably not as easy to hack Kukarella now. Or at least, we hope so.

It had been two weeks since the attack, and the excitement had died down. In the first few days I checked the Firebase metrics several times a day and asked Erik to double-check that everything was in order whenever I saw unusual activity. Now I go to the console only as I did before the attack: when a user asks us to check some data. Solomon was right: everything passes.

At first, my friends sympathized with us, blaming the attacker. But to be honest, I'm grateful for the attack. It forced us to put aside all routine tasks and focus on ensuring our complete safety. If it hadn't been for the attack, we might still be postponing everything until better times. And who knows - if the attack had happened later, the damage could have been much worse.

I couldn't resist thanking that Vietnamese contractor for the attack and for helping us find the site's vulnerability, but he didn't respond. He probably blocked my email too.

December 16, 2022