Assess Your Readiness Now for the SEC Cybersecurity Disclosure Rules

By Michael Isbitski - MARCH 27, 2024


The SEC’s new ‘Cybersecurity Risk Management, Strategy, Governance, and Incident Disclosure’ rule (issued on July 26, 2023) has public companies, notably smaller ones, worrying about whether they have enough cybersecurity expertise to run a security program consistent with SEC requirements. It’s important to remember that the SEC is expanding upon previously conveyed expectations that investors should be informed in a timely manner of material risks and how the organization is mitigating them. To that end, companies need to follow basic security best practices and maintain effective cybersecurity programs.

As many security practitioners and leaders would attest, an organization’s failure to exhibit basic security hygiene leads to security incidents and breaches that must be disclosed. Historically, we’ve seen how some security incidents can be catastrophic and lead to the collapse of a victimized company. Some organizations may stumble in assembling the appropriate people, processes, and technology to support a cybersecurity program, but these elements should be table stakes for operating a business. Sysdig covered this topic at length back in April 2023 in a LinkedIn Live session, “Are You Prepared for the New SEC Cyberattack Disclosure Guidelines?”

Finalization of the SEC cybersecurity disclosure rules is akin to a teacher giving you a pop quiz on something you should already know.

The SEC rules focus on improving three core elements of cybersecurity for publicly traded companies:

  1. Disclosing cybersecurity expertise
  2. Maintaining cybersecurity strategy, governance and risk management processes
  3. Disclosure of material cybersecurity incidents within four business days

These SEC rules aren’t new. They’ve been stated and refined in 2011, 2018, and now again with the final rules adopted on July 26, 2023. Most aspects of the finalized disclosure rules remained the same as earlier proposals, but the requirement for board cybersecurity expertise was ultimately relaxed. All publicly traded companies, and those aspiring to be publicly traded, must comply. By virtue of software supply chains, this also quickly extends to and impacts private companies.

You might be asking why the SEC needed to do this at all. Industry seemingly has done plenty to standardize and promote cybersecurity. Simply stated, investors need to be able to assess the cybersecurity program and history of security incidents of an organization they’d like to invest in. This is not unlike how they would evaluate the financials of a publicly traded company as a gauge of stability before making an investment. In this context, the cybersecurity disclosure rules are akin to the Sarbanes-Oxley Act, which required public companies to disclose the status of their internal controls over financial reporting (ICFR) and whether any material weaknesses would undermine the effectiveness of those controls. Events like SolarWinds demonstrated the far-reaching impacts of a vulnerability or security incident, and these types of events can also be damaging to investors. The SEC works to ensure that investors are provided with timely, accurate, and complete information so they can make informed investment decisions. This should hold true for financials as well as cybersecurity posture, particularly given the financial risks associated with a breach.

Review how the SEC disclosure rules impact you

CISOs lead their security teams and oversee their organization’s security programs. The quality of these efforts can have a material financial impact for the firm. High-profile incidents, coupled with the new SEC cybersecurity rules, mean that cybersecurity is clearly a C-suite and board-level matter. The SEC was already requiring that companies disclose relevant information about cybersecurity incidents in quarterly or annual filings, even before the finalization of the disclosure rules that now require an 8-K to be filed for a material cybersecurity incident. Any security program an organization implements should be auditable and defensible. The CFO knows the financial state of their company. In a similar vein, the CISO (or a suitable alternative like the CIO) must know the security risks of the organization at any given moment. Multiple roles, notably the CISO and the executive leadership team, need either direct cybersecurity expertise or strong governance and oversight skills to ensure that the disclosure requirements of the SEC’s new rules are adequately fulfilled.

Ideally, board members should also possess the requisite knowledge to oversee cybersecurity programs as part of their larger governance role. The board expertise requirement was relaxed in the final version of the rules, but a reasonable investor could infer that a board without technical knowledge may not be able to fulfill its oversight obligations. CISOs need not be on the board, but there’s an expectation that the CISO or a designated alternative is able to communicate freely and with transparency. The CISO must understand the business, the industry, and the general operating context of the company. Though not explicitly outlined by the SEC, CISOs need to be able to convey the most critical security risks with appropriate business context to be effective in board discussions.

The rest of the executive leadership team, and to a lesser extent the board, ideally need to have an appropriate level of technology understanding to ensure that cybersecurity programs are well managed and material risks to the organization are communicated to investors and other stakeholders. In some cases, an advisory function to the board may also be outsourced to a trusted third party that serves as a Qualified Technology Expert (QTE). The status of the company’s and the board’s cybersecurity expertise must be disclosed to the SEC.

The SEC disclosure rules include provisions for smaller organizations or those with no CISO. Depending on an organization’s structure and resources, it’s not uncommon to have a CIO accountable for cybersecurity. As with failures on other filing requirements, the SEC’s enforcement actions could include financial penalties, censure of executives who have made false statements, and potentially delisting companies from public exchanges for particularly egregious violations. All of these outcomes are far worse than most cyber attacks.

Disparity in language requires companies to be more deliberate in how they communicate. “Risk” means something very different to a CFO than to a CISO. The former thinks in terms of financial risk; the CISO thinks in terms of cybersecurity risks and how they could impact the organization if realized. “Security” to a financial person may evoke the notion of stocks or bonds, not pieces of technology or the controls that protect it all. Security leaders must also better frame the potential business impact of security risks instead of relying solely on vulnerability numbers. Security practitioners and leaders need to quickly “business up” so all parties are speaking the same language.

Traversing the gray area of materiality

The materiality threshold gives organizations, boards, and management a considerable amount of wiggle room. What constitutes a material cyber incident? We can gain some clarification by examining the definition of material facts pertaining to accounting errors as interpreted by the US Supreme Court, which is:

“a substantial likelihood that the … fact would have been viewed by the reasonable investor as having significantly altered the ‘total mix’ of information made available.”

Materiality can be impacted by quantitative or qualitative factors. Financial materiality may be determined by commonly defined terms of some percent of assets, liabilities, income, or expenses. For the lens of cybersecurity, this presumes that the organization performs a business impact analysis (BIA) to understand how technology failures impact those financial measures of materiality.

Rarely does an organization perform risk management and BIAs across the board for all technology to answer this question appropriately and effectively. Additionally, organizations need end-to-end visibility into all operating environments, and that visibility must be near real-time. For many organizations, that picture is point-in-time and/or manually generated. Finally, organizations must continuously gather sufficient telemetry within their operating environments to assess business impacts and determine the materiality of an incident if and when it occurs. It’s not enough to presume the criticality or sensitivity of a service or data, since those labels can change over time based on consumption patterns and the volume of data or records in aggregate.
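As a purely illustrative sketch, a quantitative materiality screen derived from a BIA might compare an incident’s estimated impact against percentage-based financial thresholds. The threshold values below are hypothetical rules of thumb, not SEC-defined figures, and qualitative factors (reputational harm, regulatory exposure) can make a smaller incident material:

```python
def exceeds_quantitative_threshold(estimated_impact: float,
                                   annual_revenue: float,
                                   total_assets: float,
                                   revenue_pct: float = 0.05,
                                   assets_pct: float = 0.005) -> bool:
    """Illustrative quantitative screen only.

    Flags an incident for materiality review when its estimated
    financial impact crosses a percentage of annual revenue or of
    total assets. The default percentages are hypothetical rules of
    thumb; a real analysis would use thresholds set with finance and
    legal teams, and would weigh qualitative factors as well.
    """
    return (estimated_impact >= revenue_pct * annual_revenue
            or estimated_impact >= assets_pct * total_assets)
```

A screen like this is only a first filter: an incident that passes it almost certainly warrants a materiality determination, but one that fails it may still be material for qualitative reasons.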

Some questions that may be helpful when determining materiality for cybersecurity incidents include:

  • Would incident disclosure change the mind of an investor?
  • How can you quantify or qualify a given cybersecurity incident?
  • Are you considering impacts beyond lost revenue or asset cost such as brand or reputational risk?
  • How can you best prepare for discussions with the board related to cyber risk, and not just with a deluge of numbers?
  • How prepared is your organization to identify and report on cyber risks?
  • How does the organization translate technical or cyber risk into operational or enterprise risk?

There will likely be a period of growing pains for many organizations as they grapple with appropriate security technology and governance and disclosure processes to assess materiality of cyber incidents. Regardless of the subjectivity that comes into play interpreting events, it’s in the organization’s best interest to quickly report incidents, even those that may still be undergoing investigation. Materiality will likely take time to fully assess, or it may manifest later through additional forensic analysis. Failure to disclose promptly can warrant investigation by the SEC after the fact, invite scrutiny of the information (or lack thereof) that’s been submitted prior, and result in issuance of Wells Notices. In the case of the SUNBURST breach in 2020, SolarWinds disclosed two days after being notified of the issue, compliant with the SEC disclosure requirements. However, current and former executive officers and employees, including the CFO and CISO, are still being investigated per a recent Form 8-K filing.

Steps to follow next for publicly traded organizations

Review the SEC Public Company Cybersecurity Disclosure Final Rules and focus on the following dates:

  • All public companies must comply with the annual disclosure requirements for cybersecurity processes and expertise beginning December 15, 2023
  • Public companies must comply with incident disclosure requirements beginning December 18, 2023
  • Smaller companies are eligible for an extension on incident disclosure requirements and must comply by June 15, 2024

Cybersecurity posture varies greatly across companies, so guidance is generalized here. In no particular order, some ideas to get started include:

  1. Augment your risk management program as necessary to detect and respond quickly to cyber incidents for all operating environments. Scope includes traditional datacenter as well as cloud-hosted and cloud-native environments. If you’re already asking questions about the SEC disclosure rules, you’re likely ahead of the curve.
  2. Document what you’re doing in your cybersecurity program, including relevant standards or frameworks, and disclose the information as appropriate in a Form 10-K and Form 10-Q.
  3. Consider having a formal security program assessment by a trusted third party that evaluates the breadth and depth of your organization’s security program and its capabilities. Tie this assessment to a security framework or standard.
  4. If you haven’t settled on a cybersecurity program approach yet, start now. NIST CSF and NIST SP 800-53 are starting points for many organizations, as is the ISO 27000 series. Security requirements also vary based on your vertical or sector and applicable regulations.
  5. Examine where you have pockets of cybersecurity expertise, shift them as needed, and/or recruit as needed to fill gaps.
  6. Review DFIR processes and playbooks to ensure that relevant business details are gathered to assess business impact and help determine materiality of a given incident.
  7. Revisit security tooling to ensure that it provides end-to-end, real-time visibility into all operating environments and is informed by runtime insight (i.e., you can’t rely on shift-left approaches alone). This better equips you to assess your actual attack surface, determine where critical assets reside, and risk-prioritize effectively.
  8. Ensure that threat detection and response capabilities can quickly identify attacks. You must be able to quickly gather relevant telemetry to understand the scope of cyber incidents, their business impact, and their materiality to the organization.
  9. Validate that recovery mechanisms are adequate for quickly restoring operations after incidents occur and demonstrate cyber resiliency to investors.
  10. Extend incident response workflows beyond the expected SecOps components to include other relevant business teams like Finance and Legal. These teams should be included at appropriate points of DFIR once sufficient data has been captured that helps illustrate the business impact and materiality to the company. Security practitioners are more likely to view security incidents from a technical lens that includes factors like vulnerability severity, infrastructure misconfiguration, exploitability, or network exposure that can lead to a system compromise or data breach.

Too many cybersecurity efforts fail to get off the ground as teams get stuck trying to determine the best approach. Compare notes with industry peers, but also anticipate that many companies may also exhibit lower cybersecurity maturity. The SEC rules will help build more transparency around cybersecurity over time. Remember that the primary mission of the SEC is to protect investors. This shouldn’t be a big stretch from a customer-first mentality of successful organizations. In this case, the SEC is requiring this information on behalf of investors who are another type of customer.

Steps to follow if your organization suffers a cyber incident

Authority and accountability for the many elements of a cybersecurity program extends beyond just security or IT departments. If your publicly traded organization suffers a cybersecurity incident, steps you should consider include:

  1. Work with your CFO or counsel to file a Form 8-K with the SEC within four business days of determining that the incident is material. Note that the clock starts at the materiality determination, not at initial discovery of the incident.
  2. Engage corporate communications and crisis communications teams to get ahead of potential negative media reactions. Rapid response and transparency are key.
  3. Ready sales playbooks as investors and sales prospects will be asking for details of the incident and may question the maturity of the organization’s cybersecurity program or its ability to respond and recover.
  4. Brace for customer questions about the impact to their organization and its data. They may also want to revisit contractual agreements if they question the validity of the organization’s cybersecurity approach. Customers here may also include partners and suppliers, not just end users.
  5. Prepare for litigation. Board members and management are frequently named in derivative lawsuits following incidents and breaches. Ensure that you are engaging with the organization’s legal teams and bringing them up to speed with what has occurred.
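The four-business-day Form 8-K window in step 1 can be sketched as a simple deadline calculation. This is a simplified illustration that counts weekdays only; it ignores federal holidays, which also pause the business-day clock in practice:

```python
from datetime import date, timedelta

def disclosure_deadline(determination_date: date) -> date:
    """Return the Form 8-K filing deadline: four business days after
    the date the incident is determined to be material.

    Simplified sketch: skips weekends only, not federal holidays.
    """
    deadline = determination_date
    business_days = 0
    while business_days < 4:
        deadline += timedelta(days=1)
        if deadline.weekday() < 5:  # Monday (0) through Friday (4)
            business_days += 1
    return deadline

# For example, a materiality determination on Friday, March 1, 2024
# yields a filing deadline of Thursday, March 7, 2024.
```

Because the deadline is measured from the materiality determination, documenting when that determination was made is as important as documenting when the incident was discovered.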

NOTE: This article was revised on 8/24/2023 to clarify the initial disclosure of the SUNBURST breach by SolarWinds and the current status of the SEC investigation.


This article was originally published on Medium.
