Responsible Vulnerability Disclosure: An Expert Guide for Security Researchers

Security experts share advice on how to responsibly disclose the vulnerabilities you discover without fear of legal consequences or prison.

When it comes to responsible vulnerability disclosure, security researchers must understand how to disclose vulnerabilities safely to avoid legal consequences, fines, and, worst of all, prison. The legal risks for security researchers are not to be taken lightly: even when a vulnerability is disclosed in "good faith," the risk of legal action against the researcher remains if proper responsible disclosure practices are not followed.

In this Q&A-style article, we interview experts in the cybersecurity field, including a penetration tester, a privacy lawyer, and a security executive in the United States, as well as a world-renowned convicted hacker from Uruguay. We also review best practices for responsible vulnerability disclosure, providing a comprehensive guide to help security researchers navigate the legal risks.

Why Responsible Vulnerability Disclosure Matters

Whether you've discovered software, network, or application vulnerabilities, it's critical to follow proper legal channels and obtain the necessary permissions before reporting any security vulnerability to an organization or the appropriate authorities (e.g., the FBI, the Department of Homeland Security, or CISA). Improper vulnerability disclosure carries several legal consequences that security researchers may face, even unknowingly, including:

  • Lawsuits: Companies can file lawsuits against security researchers, which may result in legal expenses, fines, or criminal charges under laws like the CFAA. Researchers are at risk of being accused of unauthorized access, accessing beyond permitted areas, or causing harm to a computer system.
  • Imprisonment: Mishandling vulnerability disclosure can escalate to criminal prosecution, and in extreme cases, researchers might face jail time.
  • Financial Loss: Legal disputes can become very costly, requiring specialized legal counsel, court fees, and fines, while researchers may also lose income during these proceedings. Bail costs can further complicate matters.
  • Damaged Reputation: A security researcher's reputation can be irreversibly tarnished, affecting future job opportunities in the cybersecurity industry.
  • Retaliation: Without a safe harbor agreement or a vulnerability disclosure program (VDP) in place, a researcher who discloses a vulnerability has no protection and faces significant legal risk if the organization chooses to retaliate, pursue legal action, or report them to authorities.
  • Blacklisting: You can be viewed as a liability and barred from VDPs, bug bounty programs, and professional opportunities.

Additionally, from an organizational perspective, improperly disclosed vulnerabilities increase security risk. Publicizing vulnerabilities without proper handling raises the chances of zero-day attacks and exploitation. Companies may also face regulatory fines for failing to address vulnerabilities, and their reputation may suffer from unresolved security flaws. Finally, organizations with a poor track record of responding to ethical hackers' vulnerability disclosures are at greater risk of suffering significant security breaches.

What Should Hackers Know About Disclosing Vulnerabilities Safely?

We interviewed Alyssa Miller, a seasoned hacker and business information security officer (BISO) with over 15 years of experience in cybersecurity. We asked her what hackers should know to disclose vulnerabilities properly, especially if they fear legal repercussions. Here are some key takeaways:

Understand the Company’s Policies

Before diving into any system, it’s crucial to understand if the organization has a bug bounty program or clear reporting procedures.

Alyssa explains, "Before you go digging into any site, you want to be careful about understanding if they have a Bug Bounty program, a reporting policy, and making sure you clearly understand and follow their process. Sometimes, we may discover things by accident, and that’s when it’s far trickier."
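A quick, low-risk way to check whether an organization publishes a reporting policy at all is to look for a security.txt file at /.well-known/security.txt, the location defined by RFC 9116. The minimal sketch below simply fetches and prints that file so you can read any Contact and Policy fields before doing anything else; the domain is a placeholder, and this is an illustrative sketch rather than part of any particular program's process.

# Minimal sketch: look up a site's security.txt (RFC 9116), which many
# organizations use to advertise their vulnerability reporting policy.
# "example.com" below is a placeholder; substitute the organization you
# are researching, and only interact with systems you are permitted to touch.
import urllib.request
import urllib.error

def fetch_security_txt(domain: str) -> str | None:
    """Return the contents of /.well-known/security.txt if the site publishes one."""
    url = f"https://{domain}/.well-known/security.txt"
    try:
        with urllib.request.urlopen(url, timeout=10) as resp:
            if resp.status == 200:
                return resp.read().decode("utf-8", errors="replace")
    except (urllib.error.URLError, TimeoutError):
        pass
    return None

if __name__ == "__main__":
    policy = fetch_security_txt("example.com")  # placeholder domain
    if policy:
        # Look for Contact:, Policy:, and Preferred-Languages: fields in the output.
        print(policy)
    else:
        print("No security.txt found; look for a published VDP or bug bounty page instead.")

If no security.txt or published program exists, that is exactly the "far trickier" accidental-discovery situation Alyssa describes, and the safe harbor and documentation steps below matter even more.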

Request a Safe Harbor Agreement

Before disclosing any vulnerabilities, always seek legal protection to avoid potential retaliation. According to Alyssa Miller, securing a Safe Harbor Agreement is an essential step in protecting yourself from legal consequences.

"It’s important to reach out to an organization and ask for a Safe Harbor Agreement before telling them you discovered anything. The Safe Harbor Agreement allows you to report a vulnerability without any threat of legal action. And if they don't agree to it, then you don't disclose the vulnerability."

Alyssa further stresses the importance of withholding sensitive details if an organization refuses to sign the agreement.

"You have to understand that if they won't give you a safe harbor agreement, you don't give them additional details."

She advises being clear but cautious when approaching companies. 

"You can go so far as to say, 'I’ve discovered something that appears to be a significant vulnerability in your web app. I’d like to disclose it, but I’d like to have a Safe Harbor Agreement in advance,' so they agree they won’t pursue legal action."

Document Everything Thoroughly

Alyssa underscores the importance of keeping detailed records of your work:

"The most important thing is to be very detailed in your documentation, to be very clear on what you did, and honestly, be careful that you don't stumble into private data or something else. So let's say as you build a POC (proof of concept), wherever possible, stay within your data."

If you’re able to create multiple accounts, ensure that you only access data you're authorized to use. This is crucial for avoiding legal complications, as Alyssa warns: 

"You have to be exceptionally careful."

To explore the legal side of vulnerability disclosure, we spoke with PrivacyLawyerD, a data and privacy lawyer (whose full name is withheld for privacy reasons) with experience at a leading global streaming platform. We asked him for guidance on how security researchers can safeguard themselves legally when disclosing vulnerabilities.

Consult a Lawyer Before Disclosing Vulnerabilities

Before taking any action, PrivacyLawyerD stresses the importance of legal consultation.

"Consult with a lawyer first for your own situation. The safest way for a security researcher is to make sure that the entity you're researching has a robust Bug Bounty program, robust procedures in place for reporting, and if there's an informal history of how the business responds to inquiries."

According to PrivacyLawyerD, understanding the legal framework and consulting a lawyer—who can offer professional advice specific to your situation—is critical before making any disclosures. Even if the organization has a Bug Bounty program, there can be risks if they don’t follow their own guidelines.

Be Aware of Bug Bounty Program Risks

While Bug Bounty programs are designed to encourage vulnerability disclosure, PrivacyLawyerD points out that not all companies follow their own rules.

"What really pisses me off is when they claim to have a Bug Bounty program and don't abide by their own rules because it’s just a matter of respect between companies and researchers out there that want to help. Because most researchers are ethical and decent people and also want to help these companies."

PrivacyLawyerD advises researchers to investigate how a company has handled disclosures in the past to avoid unnecessary risks.

Even if you’ve acted ethically, the legal ramifications can be severe if the company retaliates. PrivacyLawyerD recommends making documentation and legal consultation an essential part of your process.

"Ultimately for a researcher, if you're worried about legal ramifications, have a conversation with a lawyer before you disclose anything. Usually, they’re free, and if it’s a professional consult, they'll tell you if you need to talk further for a complex situation."

Real-World Lessons from Alberto Daniel Hill

We spoke with Alberto Daniel Hill, the first hacker in Uruguay to be imprisoned for disclosing vulnerabilities. His story sheds light on the risks security researchers face when legal protections aren’t in place. We asked him what he would have done differently.

"I used to always say if I had to live my life again and all the outcomes of my decisions, I would make the decisions again. But after my situation that destroyed my life and took everything away from me, I wouldn’t have been so naive, and I wouldn’t have reported anything, because what I reported to the medical provider was only 1% of the reports I’ve made as a professional security researcher."

Lessons from Hill's Experience

Hill’s story emphasizes the importance of being cautious, especially when dealing with companies that don’t have clear policies or agreements in place. His situation serves as a powerful reminder that even ethical actions can have devastating consequences without the proper legal protections in place.

How Hill's Case Could Have Been Different

Hill believes that having better legal safeguards and protections could have changed the outcome of his case. His advice to other researchers: take every precaution, no matter how small the vulnerability you’re reporting.

Don’t Be Naive—Always Protect Yourself

Hill reflects on his own naivety, admitting that he trusted the system too much. His story is a sobering reminder for all security researchers to never overlook the importance of legal protection.

"If I had to live again, I’d be a hacker again, but I wouldn't be so naive to trust people like I did and report those things. For me, it’s complicated because I should want to help and do the right thing. But doing that and dealing with a legal system--a powerful corrupt legal system--if I could back in time, I wouldn’t do that because it was the Login to Hell. I had to live in hell. My life was destroyed and I’m still recovering, and I don't know if I can recover completely again and the events of that report. "
"I hope my case will eventually end and change the way computer crimes are being persecuted in my country, and I have a petition. It’s absurd we don't have computer crime laws here in Uruguay as if we are in the 19th century. I don't want people to suffer what I did. I want to be the first and last person to go through that due to the incompetence of police and justice of society in my country."

To Find or Not to Find a Vulnerability

Fear of legal consequences has discouraged many security researchers from disclosing vulnerabilities, especially after hearing stories like Alberto Daniel Hill's or those of other cybersecurity professionals charged with computer crimes. We asked penetration tester Phillip Wylie and Alyssa Miller for their thoughts on whether it's ethical, or even advisable, to find and disclose vulnerabilities without permission. Here's what they shared:

Phillip Wylie shared his perspective: 

"If their intention is good, they should have some kind of permission. Or some kind of Bug Bounty. You definitely need that 'OK' to do that. You don’t want to test things without permission because sometimes people stumble across things when they shouldn’t."

Phillip Wylie also touched on how ethical hackers can face legal consequences: 

"What it does is, it makes the industry look bad as a whole. Ethical hackers are supposed to help. But when someone engages in criminal activities as an ethical hacker, it blurs the lines between right and wrong."

What Makes a Security Researcher a Criminal?

Alyssa Miller highlighted the fine line between ethical research and criminal behavior, explaining how the CFAA (Computer Fraud and Abuse Act) can be broadly interpreted. 

"It’s hard to draw a specific line. If what you’re doing isn’t for personal gain or to embarrass someone, that’s where the line starts. As long as you’re not accessing data you’re not supposed to access, like in a proof of concept, you should be okay."

Advice for Businesses on Responding to Vulnerability Disclosures

Organizations often approach vulnerability disclosures with fear, uncertainty, and doubt (F.U.D.), labeling ethical hackers as "bad guys." Alyssa Miller and PrivacyLawyerD provide insights on how businesses should respond to good-faith disclosures from security researchers.

How Businesses Should Treat Ethical Hackers

Alyssa Miller shared her thoughts on how organizations can build trust with security researchers:

"Start by giving them the benefit of the doubt and establishing a cooperative relationship. Most of the time, researchers have good intentions, or they wouldn’t disclose the vulnerability."

She further suggests offering compensation to researchers, even if they don’t demand it, and negotiating in good faith.

"You should look into the possibility of how you will offer Bug Bounty rewards, and what criteria you’ll create in how those Bug Bounties should be." And if they're not demanding compensation, still consider compensating them, especially if they are coming to you by reporting something to you they don't need to. "They are doing you a favor."

Building Stronger Relationships with Security Researchers

Alyssa Miller emphasizes that many organizations jump to legal action because they lack security awareness or security maturity, which leads to misguided reactions. She explains:

"Organizations that lack security awareness are more likely to take legal action against researchers, but security researchers usually have good intentions."

Alyssa further continues:

"Cybercriminals are not typically going to you to report a vulnerability to you. In rare cases, a cybercriminal leveraged a vulnerability for gain and then reported it. Security researchers' intentions are good."

Understanding the Intentions of Security Researchers

Alyssa Miller stresses that security researchers generally have good intentions, even when they request compensation for their work. She elaborates: 

"Even when demanding payment, although I don't like the idea of demanding payment when you’re not contracted with an organization, their intentions are good."

Organizations should recognize this and avoid creating an environment of distrust. By doing so, they can prevent driving security researchers to disclose vulnerabilities publicly rather than through appropriate channels.

Why Public Disclosure Happens: Trust Issues in Vulnerability Reporting

Alyssa Miller notes that taking aggressive action against security researchers can lead to unintended consequences. She explains: 

"So, using that opportunity to go after somebody just sows distrust, and ultimately it is going to lead researchers to not disclose anything to your company and instead disclose it publicly."

She challenges organizations with a thought-provoking question:

"Would you rather encourage them to disclose to you rather than disclose publicly?"

PrivacyLawyerD warns businesses about the consequences of ignoring or retaliating against ethical hackers: 

"Companies often freak out when vulnerabilities are found. But the cybersecurity community respects companies that come forward and address their issues.  When businesses sue researchers, they look like the bad guys. In reality, breaches are far more expensive than paying for Bug Bounties."

He further adds: 

"If a researcher comes to you in good faith, work with them. If you don’t, you risk that vulnerability being exploited by someone with malicious intent."

Why Businesses React Negatively to Vulnerability Disclosures

PrivacyLawyerD explains the typical reactions of businesses: 

"This is all anecdotal, but from what I’ve seen, businesses take action against security researchers because they’re nervous, freaking out, and trying to cover their asses. I think it’s an ego thing too. A lot of companies with security issues would be respected by the security community if they just admitted the problem, fixed it, or told researchers what they were working on."

He continues: 

"Listen to the advice from security researchers and pay out on your Bug Bounties to those researchers who want to help you. Otherwise, when these companies sue researchers and take aggressive actions, they look like the 'bad guys' in the situation."

PrivacyLawyerD adds that retaliation from companies is often counterproductive: 

"When businesses take legal action instead of working with researchers, they’re not willing to learn, grow, or improve their positions."

"Nobody’s security is 100%, and nobody’s privacy is 100%. If you don't work to learn everything you can about privacy and security, then you're setting yourself up for failure."

He also notes: 

"When a security researcher comes in good faith to let the company know about a vulnerability, the company should jump at the opportunity to fix it. If you don’t, someone with less ethical intentions will exploit that vulnerability."

The Impact of Treating Security Researchers Like Whistleblowers

PrivacyLawyerD highlights the larger issue at play: 

"The way companies treat security researchers mirrors how whistleblowers are treated in society. Whistleblowers have been treated badly in every aspect, and this has damaged security. Security researchers are not trying to be whistleblowers, but the paradigm of how whistleblowers are treated has infected the way companies treat them."

How Companies Can Create a Secure, Cooperative Ecosystem

PrivacyLawyerD emphasizes: 

"If you’re a company and somebody discloses a vulnerability to you, treat them seriously and with respect. They’re doing you a favor. If you get defensive, it makes you look bad. Respecting security researchers helps sustain a healthy cybersecurity ecosystem."

He also recommends: 

"If you don't have a Bug Bounty program, please set one up. And if you do set one up, follow the terms ethically as outlined in your own program."

Dealing with Bug Bounty Program Failures

PrivacyLawyerD shares advice for researchers who encounter companies that fail to follow their Bug Bounty programs: 

"If a company doesn't abide by their Bug Bounty program, it puts the security researcher in a difficult situation. I think ethically, if the company tries to patch it secretly or doesn’t follow the terms, the researcher should publish the findings. You don’t want to release vulnerabilities, but sometimes you have no other option."
"If the company is not honoring the Bug Bounty, there’s no harm in publishing what you found in a minimal and non-harmful way to the entity. PR be damned if they don’t follow the program because it makes them look bad and like a fool."

Companies Should Respect Security Researchers and Vulnerability Disclosure

In closing, PrivacyLawyerD shares an important piece of advice for companies:

"If somebody discloses a vulnerability, treat them seriously and with respect. They’re doing you a favor. Acting defensively just makes you look bad. Respecting security researchers helps sustain a healthy cybersecurity ecosystem. Bug Bounties can be expensive, but breaches are far more costly."