Category Archives: Stories

International Threat Rating System

How to Provide Transparency

In the 2010s, the organization was primarily US-based, with an employee presence in many countries. Its US-based intranet had a few portals to the internet but was otherwise an isolated network, with segments serving literally hundreds of buildings and campuses across the US. All other locations accessed the intranet by connecting through the perimeter. Over time, selected organization buildings at international locations were connected to the intranet.

The criteria for approving an international connection to the organization intranet included business and security considerations. The dominant element in the security criteria was country threat, which was understood subjectively by security personnel, particularly those involved in investigative support, but there was no means to define the threat objectively for executives. Frustration and anger were common.

By this time, I had been studying and using FAIR for several years. My partner in the International organization had become aware of a variety of country rating systems. This spawned the question: can multiple country rating systems be combined and weighted to provide a transparent country threat rating?

The solution had three elements. First, all sources had to be normalized to a 0-100 score. Second, each rating was aligned with the FAIR taxonomy elements related to threat, specifically Threat Event Frequency (emphasizing Probability of Action) and Threat Capability. Finally, the elements were given weights, recognizing that some had a stronger correlation to threat than others. Organization ratings that informed this analysis were also incorporated. The result was displayed as a sorted list of threat ratings; countries below a threshold could be allowed to connect to the intranet. While executives who were told no were still not happy, the method made the decision process transparent, thus eliminating surprises.
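To make the roll-up concrete, here is a minimal sketch in Python of the kind of weighted, normalized scoring described above. The source names, weights, bounds, and threshold are illustrative placeholders, not the values actually used; the real analysis mapped each source to the FAIR threat elements and applied its own weights.

```python
# Illustrative sketch of combining country rating sources into a single
# 0-100 threat score. Source names, weights, and the threshold are
# hypothetical placeholders.

def normalize(value, lo, hi):
    """Rescale a source's raw value onto a 0-100 scale."""
    return 100.0 * (value - lo) / (hi - lo)

# Hypothetical sources: (weight, low bound, high bound)
SOURCES = {
    "espionage_index":  (0.4, 0.0, 10.0),   # informs Probability of Action
    "cyber_capability": (0.4, 0.0, 5.0),    # informs Threat Capability
    "internal_rating":  (0.2, 0.0, 100.0),  # organization's own rating
}

def country_threat_score(raw_scores):
    """Weighted average of the normalized source scores for one country."""
    total = sum(
        weight * normalize(raw_scores[name], lo, hi)
        for name, (weight, lo, hi) in SOURCES.items()
    )
    return total / sum(w for w, _, _ in SOURCES.values())

countries = {
    "Country A": {"espionage_index": 2.0, "cyber_capability": 1.0, "internal_rating": 20.0},
    "Country B": {"espionage_index": 8.5, "cyber_capability": 4.5, "internal_rating": 70.0},
}

THRESHOLD = 50.0  # connections allowed only below this score (illustrative)

for name, score in sorted(((c, country_threat_score(r)) for c, r in countries.items()),
                          key=lambda item: item[1]):
    verdict = "may connect" if score < THRESHOLD else "review required"
    print(f"{name}: {score:5.1f}  ({verdict})")
```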

Implementing New Security Service

From Requirements through Operation

The emergence of the Advanced Persistent Threat triggered a frenzy of strategic planning, resulting in a prioritized list of control improvement projects. Most fell within the responsibility of existing teams. However, one did not have an obvious owner. After a few less-than-successful attempts by others, I stepped up to lead the project.

The Lockheed Martin “Kill Chain” is a common way to refer to the stages or steps a threat actor takes from the beginning of an attack to succeeding in their objective. My assignment was to reduce the probability of information theft at the point where the threat actor controlled an identity in the system that had authorized access to Office files contained in Windows file shares. The contributing factor is that access is granted to file shares as needed to perform assignments, but the access authorization frequently remains in place long after the need has ceased. I first became aware of this problem in the late 1980s, but it had defied solution all that time.

Steps in the project included documenting the objectives and ultimately the requirements for the solution. My earlier work in developing and maintaining the security control framework led me to review our security requirements manual for relevant requirements. I was delighted to discover text that could be incorporated into the project requirements document, rather than having to invent requirement text. There were two relevant requirements: removing unneeded access authorizations, and monitoring for anomalous behavior. Detailed functional requirements were developed to support these high-level requirements.

A market search was performed to identify candidate solutions. The candidate products were each evaluated against the detailed requirements. Only one product was responsive to all main requirement areas, so it was ultimately selected and implemented. I’ll skip all those details except one. During the proof-of-concept (pre-production) phase we operated the system with a very limited scope of file-share servers. During that time, we discovered that one user was reading an extraordinary number of files. We ultimately turned the data over for investigation as a potential insider threat. Of course I was not in a position to learn the outcome of the investigation, but it was heartening to confirm that the solution met expectations.
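To give a feel for what “monitoring for anomalous behavior” means in practice, here is a minimal sketch, assuming all we have are per-user daily file-read counts. The product’s actual detection logic was far richer and is not shown here; the field names and thresholds below are hypothetical.

```python
# Illustrative sketch: flag users whose daily file-share read count is far
# above their own historical baseline. Thresholds and data are placeholders.
from statistics import mean, stdev

def flag_anomalies(history, today, min_sigma=4.0, min_reads=100):
    """history: {user: [daily read counts]}, today: {user: count}."""
    flagged = []
    for user, count in today.items():
        past = history.get(user, [])
        if len(past) < 5 or count < min_reads:
            continue  # not enough baseline data, or volume too small to matter
        mu, sigma = mean(past), stdev(past)
        if sigma == 0:
            sigma = 1.0  # avoid division by zero on a perfectly flat baseline
        if (count - mu) / sigma >= min_sigma:
            flagged.append((user, count, round(mu, 1)))
    return flagged

history = {"alice": [30, 42, 35, 28, 40, 33], "bob": [50, 55, 48, 60, 52, 47]}
today = {"alice": 37, "bob": 4100}  # one user reading an extraordinary number of files

for user, count, baseline in flag_anomalies(history, today):
    print(f"{user}: {count} reads today vs baseline ~{baseline} -> refer for review")
```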

Quantifying Information Risk

A FAIR Way from Fear, Uncertainty and Doubt

Organization executives started paying attention to computing security in the early 1990s. Digital networks had been established to support selective communication with suppliers, evolving to include massive digital product definition data. On one hand, executives expressed vague fear, uncertainty, and doubt about the potential for data loss, yet at the same time they challenged the costs of proposed security control improvement projects. The typical security executive presentation at least introduced the idea of a scale balancing security costs against the potential for loss, but relied on personal persuasion to gain support for projects.

The organization used a 5×5 risk grid for reporting schedule, cost, and technical risk associated with projects. Placement of a “risk” on the grid was the subjective judgement of a project manager, with adjustments to the status associated with a “risk mitigation” plan. The security function adopted the risk grid primarily because it was recognized. While many organizations labeled any technical “vulnerability” finding or non-compliance as a risk, we used the grid to communicate only the primary risk scenarios for the organization. An example for illustration: theft of proprietary information by insiders. Note that the asset (proprietary information) and the threat agent (insiders) are specifically identified. While this focused attention on the few key risk scenarios, the lack of granularity in the 5×5 matrix made communicating progress on risk reduction impossible.

The FAIR methodology first came to my attention around 2010. I was attracted to the clarity the FAIR taxonomy provided in understanding how to measure risk. However, the lack of a tool for performing a FAIR risk analysis stymied progress at the time. Fortunately, there are now several tools available for performing FAIR risk analyses.
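For readers unfamiliar with how a FAIR analysis turns the taxonomy into numbers, here is a minimal Monte Carlo sketch. It is not any particular vendor’s tool, and every range estimate below is a made-up placeholder rather than calibrated data; it only illustrates how Threat Event Frequency, Vulnerability, and Loss Magnitude combine into annualized loss exposure.

```python
# Minimal Monte Carlo sketch of a FAIR-style analysis: annual loss is simulated
# from threat events, the chance each becomes a loss event, and loss magnitude.
# All ranges are placeholder estimates.
import random

TRIALS = 10_000

def pert(low, likely, high):
    """Crude stand-in for a calibrated (min, most likely, max) estimate."""
    return random.triangular(low, high, likely)

annual_losses = []
for _ in range(TRIALS):
    tef = pert(1, 4, 12)                   # threat events per year
    vulnerability = pert(0.05, 0.2, 0.5)   # chance a threat event becomes a loss event
    loss_events = sum(1 for _ in range(round(tef)) if random.random() < vulnerability)
    magnitude = sum(pert(50_000, 250_000, 2_000_000) for _ in range(loss_events))
    annual_losses.append(magnitude)

annual_losses.sort()
print(f"Average annual loss exposure: ${sum(annual_losses) / TRIALS:,.0f}")
print(f"90th percentile annual loss:  ${annual_losses[int(0.9 * TRIALS)]:,.0f}")
```

Reporting a distribution like this, rather than a dot on a 5×5 grid, is what makes it possible to show progress on risk reduction.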

Preparing for Regulators

Compliance to requirements for protecting classified information

My first job in computing security, in 1982, was facilitating compliance with US Government regulations for protecting classified information in computing systems. Each system was required to have a control plan that identified the individuals responsible for the system, the configuration of the system including both computing hardware and physical controls, and the process for operating the system with classified information. The document was approved by the government, whose agents came to inspect all systems at least annually.

Some of the company managers and custodians for these systems viewed me as an adversary, since the regulations imposed limitations and extra effort not required for normal company use of the systems. But in all cases I endeavored to help them understand that these requirements were simply among the many obligations of the contract their organization had with the government. My role was to provide experience and common practices to enable them to perform in a compliant manner as efficiently as possible. Some systems, particularly computerized test equipment, required creativity to allow tests to be performed while complying with requirements.

I visited each system whenever there were changes requiring approval by the regulator. Many systems required extensive collaboration in developing processes that complied with the intent of the requirements, which paid off with first-time approvals in nearly every situation. I also facilitated a self-inspection to ensure readiness for the government inspection. During inspections by a regulator, I was present in part to put the custodian at ease and in some instances to help ensure that the regulator and custodian understood each other. When issues arose, I made note of each item to ensure timely resolution.

Later in my career as I was involved in shaping the organization’s unclassified computing security program, I came to appreciate many elements of the government’s program for protecting classified information.

Creating Computing Security Requirements

Coordinating Among Organization Security Leaders

I entered my first job in computing security in 1982, coincident with the publication of the company’s first computing security requirements manual. The protections in the requirements were grouped into those assigned to managers responsible for systems and those assigned to users. Each group contained sentences describing the requirements, listed in no particular order. I know nothing of the origin of the requirements, but they could be described as best practices of the time.

After about a decade, and years of complaints particularly about who was responsible, it became apparent that a complete overhaul of the requirements manual was needed. The organization also lacked any easy way to report compliance, in part because the many detailed requirements had no titles.

A few years earlier I had championed the development of a computing security section for the company’s security manual for government classified information. I learned about working with technical writers, and the methods for organizing and presenting written procedural information. When I was reassigned to the unclassified computing security function, I made a point of getting to know the information technology managers who were members of the computing security working group.

The process of writing the new computing security manual involved a technical writer, technical computing security staff, and perhaps most important, the computing security working group who would approve the document. Our first breakthrough was establishing a high-level outline for the requirements, in essence the security control framework. A writing approach was established: articulate the high-level requirement, then label and describe each detailed requirement. Finally, the existing requirements were rewritten in the new style, plus many new requirements to fill in obvious gaps. The approval process involved individual communication with each steering committee member in advance of the detailed review meetings. While the process was protracted, the document was ultimately approved. The resultant document has had both minor and major revisions over the years but stands as one of the first computing security control standards.
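The writing approach can be pictured as a simple nested structure: a framework area, a high-level requirement, and labeled, titled detailed requirements beneath it. The identifiers and wording below are invented purely for illustration; they are not the manual’s actual content.

```python
# Hypothetical illustration of the manual's structure: each framework area
# holds a high-level requirement plus labeled, titled detailed requirements.
framework = {
    "AC": {
        "title": "Access Control",
        "statement": "Access to computing resources shall be limited to authorized users.",
        "details": {
            "AC-1": ("Access Authorization",
                     "Managers shall authorize access before it is granted."),
            "AC-2": ("Access Review",
                     "Authorizations shall be reviewed and unneeded access removed."),
        },
    },
}

# Giving each detailed requirement a label and title is what makes
# compliance reporting against the manual practical.
for area, entry in framework.items():
    print(f"{area} {entry['title']}: {entry['statement']}")
    for label, (title, text) in entry["details"].items():
        print(f"  {label} {title}: {text}")
```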

Promoting Awareness

Does it Translate into Action

I attended a conference among government and contractor personnel on the topic of Operations Security (OPSEC). One session was on Awareness, which sounded like a key topic given the pervasive need for personnel to implement OPSEC practices. The speaker, a dog lover, primarily showed off her OPSEC awareness posters. A picture of her dog filled one poster, which had the heading “Remember OPSEC.” Wow, that was a pretty vague statement of what the viewer was expected to do.

What I had expected to see was all the awareness methods (not just posters) that were designed to motivate the viewer to take a specific action.

Introduction to Computing Security

It’s not all about Technology

When I joined computing security in 1982, it was part of the function responsible for protecting people, property, and information. This encompassed not only security responsibilities but also fire protection and response. My reaction as a techie was that these non-computing functions had nothing to do with me.

At one point I was volunteered by my manager to be our group’s safety coordinator. One component of the role was meeting monthly with the other safety coordinators in the security and fire protection function. As a result, I not only got to know these people in the various functions but also learned some things about their functions. One person I met was the Fire Marshal, who was responsible for the standards used across the many locations where the organization operated. He also coordinated fire protection with his peers in the surrounding municipal fire departments. He was a champion of change in the national fire codes, which are standards for reducing fire risk; a very interesting model to consider when thinking about computing security control frameworks.

It is common for documents and web sites devoted to computing security to make limited, if any, reference to non-computing security controls. The reality is that technical controls rely on the protections of physical security controls, and that all security controls can be no better than the trustworthiness of the people who create, operate, and work within the controlled environment.

Identifying Sensitive Information

What not to Ask

In the early 1990s I was the security leader for the company’s major product development program. There was unprecedented international involvement, which included international digital exchange of engineering design models. Some of our international suppliers worked side by side with our engineers in company facilities. There was a lot of angst about what the suppliers should and did have access to, and varied beliefs about what information was particularly sensitive.

To resolve the information sensitivity question, I arranged a meeting with the program leader. I asked the question: “What information should the supplier personnel be permitted to access?” The answer provided was: “Any information they need to do their job, and nothing more.” That meant administering access on a person-by-person basis, which we were in no way equipped to accomplish.

I realized that in my one chance I asked the wrong question. A better approach would have been to describe at a high level the current access control policy and how it was administered, and propose changes anticipating what I could intelligently guess was his expectation. It would also have been useful to outline what people thought was important to protect (i.e. the most important Trade Secrets) to confirm or correct the understanding. In either case, there would have been the opportunity for follow-up to gain formal approval of a written direction based on the discussion.

Identifying Secrets to Protect

How to Begin

This true story took place in early 2001, which is important only to give a context of the threat environment of the time. In this case the competitor was the primary threat of concern.

I was the manager for the company’s sensitive information protection program, including support for the product development organization. One day a staff member contacted me and said it was urgent that I come meet with John.

In the meeting, John shared the high-level plan for initiating development of the next-generation commercial airplane, leveraging the strongest minds across the corporation. John was under orders from his boss, the organization leader Alan, that the program must be absolutely secret; the consequence of a leak would be that John, the program leader, would be fired. I was asked if we would take responsibility for the information protection program. He needed to start implementing the security plan in one week.

I returned to my office and arranged a meeting with my boss and the Chief Security Officer. They supported my recommendation that we accept the assignment. The goal of the security plan was preventing disclosure, which was most likely to come from program insiders. A particular scenario was the threat associated with social engineering by the competitor, since they clearly engaged in business intelligence and were widely rumored to engage in espionage.

The security plan was modeled as much as possible on the approach to protecting government classified information, particularly for personnel security (a limited need-to-know list, screening and approving access, non-disclosure agreements) and physical security (access-controlled areas, storage containers, alarms). However, given the timeframe and technology of the time, stand-alone computing was not entirely feasible. Many of the deliverables were in the form of Microsoft Office documents; the personal computers could be limited to the program areas, with closely restricted access to file shares. The balance of the computing was compute-intensive numerical analysis. While this data was the foundation of the program’s success, on its own it appeared no different from other analysis and could be disassociated from the program.

With the security plan completed and approved by my management, I was ready to meet with the program leaders. I did not expect them all to read the plan, so I prepared for a conversation about how the security program needed to operate. This was a presentation of six Vufoils for the overhead projector (remember, this was before conference room projectors became common). John introduced me, and I had said only a few introductory words when he interrupted me. His remarks covered my first slide, and the same happened for all six slides. So, I had the remarkable luck to have a leader who intuitively understood most of what needed to happen, and that cemented support by the leadership team.

Except, I had one more thing to say: “John, who is the owner of your information assets and decides how much protection is enough?” “Well, I am.” His leadership team unanimously dissented; they reminded John that he was mostly not available for day-to-day management. He relented and appointed his deputy program manager, Pete, as the owner. Pete reviewed and approved the security plan. Whenever there was a program team meeting, Pete was the one who led any discussion of security, which demonstrated to the team the leaders’ commitment to and support of the plan. We never had any complaints about why the security plan required what it did.

I’ll skip the operational details, not because they lack interest, but they don’t contribute to the lesson.

During all this time, John and Bill were talking with customers. Their goal was to elicit requirements and assess interest. I can only imagine how they did this without showing their hand too soon, which would have potentially compromised the competitive advantage. At one point Alan apparently judged the time was right, and publicly disclosed the program.

The existence of the program was no longer a secret. The leaders decided to continue the security plan, which actually became more difficult. It is pretty easy to explain to people that they mustn’t share anything about the program. It is much harder to get leaders to establish a sensitive information protection guide that identifies the categories of information to protect and relates the category to the protections required (consider government classification guides that relate categories of information to Confidential, Secret, and Top Secret protections).

Establishing Policy for Information Resources

Opportunity to Have Influence

All companies have some form of corporate policy, starting with the delegation of authority and then defining general goals for operating the organization. At one point in my career, my company rewrote the corporate policies, reducing their number to about a dozen. Some years later, some company leaders realized that there was no acknowledgement that information was a resource of the company that should be formally managed.

Part of my job had been to maintain company procedures that implemented corporate policies, which forced me to be familiar with the old policy system and learn how to work within the new one. My director asked me to propose a draft of a corporate policy on information resources. The structure and outline of the policy were fixed, but the content was a clean slate. I referred to the old policy system for applicable content and learned about legal concepts related to intellectual property.

My director enthusiastically accepted the draft, then I heard nothing more for months. What finally emerged as approved policy had many refinements from my draft, but roughly 80% of my initial content remained. This experience reinforced how much influence a person at a lower level can have when given the opportunity to prepare draft policy and procedure documents.