[PODCAST] How To Apply Critical Thought & Socratic Methods to Build Defensible IT Security Investments – with Jack Jones, Chairman of The FAIR Institute


This episode is sponsored by the CIO Scoreboard

Today I had an interesting conversation with Jack Jones. This is Jack’s second time on the show and I loved our discussion. It is a gem of learning and is packed with information that you can use right away. Jack was one of the first CISOs in the United States and he is the inventor of the FAIR model for analyzing Information Security Risk. Jack’s bio is extensive and here is a short list of his accomplishments.

6 Key Points:

1. Why top 10 lists for IT Security are useless
2. How to add probability and possibility of events happening into your risk models
3. How to present data that your board of directors will love
4. How to develop range into your communication
5. How to apply critical thinking, logic and Socratic methods to your analysis
6. How to apply rigor in developing a defensible argument

Jack Jones

Jack Jones has worked in technology for over 30 years, and in information security and risk management for 25 years. He has over nine years of experience as a CISO with three different companies, including five years at a Fortune 100 financial services company. He received the ISSA Excellence in the Field of Security Practices award at the 2006 RSA Conference. In 2007, he was selected as a finalist for Information Security Executive of the Year, Central United States, and in 2012 he was honored with the CSO Compass award for leadership in risk management. Jones is also the author and creator of the Factor Analysis of Information Risk (FAIR) framework. Currently, Jones serves on the ISC2 Ethics Committee and is the Executive Vice President, Research and Development of RiskLens, Inc.

Suffice it to say that Jack is a rock star in the Information Security and IT Security risk community!


Sponsored By:


This episode is sponsored by the CIO Security Scoreboard, a powerful tool that helps you communicate the status of your IT Security program visually in just a few minutes.


Key Resources:

Time Stamped Show Notes:

• FAIR is a framework for critical thinking and a model, or codification, of risk and how risk works. It provides a reference for thinking through complex risk problems, surfacing risk assumptions, and enabling risk discussions [04:53]
• Surfacing assumptions enables a debate-like dialogue around risk [05:15]
• Jack was one of the first CISOs, starting in the late 1980s. How do you present risk? FAIR's core technique distinguishes possibility from probability. Example: a McAfee virus incident that impacted a company and disrupted operations for a few days. The genesis was a 2003-era Windows XP system that a contractor required them to have on their network, and an administrator was using a personal machine for web surfing. The company had sophisticated tools but was still blindsided, and it already knew it had administrator-privilege issues. How would somebody apply a FAIR analysis to this? [08:49]
• Consider an organization that knows it has control deficiencies. In doing a risk analysis, ask: given the threat landscape we face, which scenarios could be painful? Develop a straightforward, high-level taxonomy covering availability, then ask which assets would be exposed from a confidentiality perspective and which from an integrity perspective. [10:00]
• At a deeper level of granularity, it is a step-by-step process: develop a taxonomy of events that represent loss, then analyze the likelihood of each loss [10:39]
• If an organization has done that (and this one might have), then for events with significant impact but low likelihood, the controls you want enable fast detection and recovery. If you are down for three days, your recovery capability is not what it should be. The organization should, in a rigorous fashion, lay out the risk landscape: conditions they know exist on the surface but cannot place relative to everything else in the landscape. That is how they triage their world, identify the set of conditions where work needs to be done, and prioritize more effectively [12:20]
• The concept of probability vs. possibility is linked to Russian roulette. Organizations fall into the trap of considering possibility rather than probability. If we focus solely on events that are conceivably possible and hugely painful, an asteroid strike would come up, and we would plan for an asteroid strike. There has to be a probability element; you can't focus solely on possibility. The possibility of bad events is 100 percent, but the probability may be much lower, and that distinction is crucial for prioritization. [14:38]
• Given the risk with the old systems, the admin issue meant malware would have had unfettered access. How would you reverse-engineer that situation? [15:09]
• In that instance there is a high probability of encountering malware; the only question, from a probability perspective, is what the odds are of encountering malware that their preventative measures aren't going to handle. Most security professionals would say that happens with some regularity, so the probability is higher. From a threat perspective, zero-day attacks happen with some regularity, so we would be able to come up with a likelihood estimate. One of the factors that plays into the likelihood is the administrative-privilege exposure: it allows the malware to have greater control and broader impact than it otherwise would [17:35]
• The patching situation would be a factor in the evaluation as well. Given the administrative and patching situations, they were in a fragile state, wholly dependent on a single control element against that malware. Within FAIR there are probability and impact, and also two qualifying states: 1) fragile, where you depend on a single control in an active threat landscape, and 2) unstable, where an asset you want to protect exists in a not-very-active landscape but you have no preventative or resistive control. Example: databases, when evaluating a rogue-database-administrator scenario where nothing stops the act. When you identify unstable conditions, you look at how you would detect and respond to an event, because you have no resistive option. [19:36]
• So the evaluation covers probability and impact, plus the two qualifiers: fragile and unstable [20:01]
• How do you estimate the likelihood of something happening? Ordinal scales have all kinds of downsides: they don't let you effectively articulate best case, worst case, and most likely case, i.e. a range of outcomes. From a probability perspective, it is not a lot of work to look at industry data relevant to the technologies in this particular organization. Look at both ends of the spectrum and at the trends: are events becoming more or less frequent? Using the data, set the minimum at, say, 5 events per year relevant to the technology you're concerned about, and the maximum at 15, or perhaps 15 to 20. Depending on the quality of the data, make the range wider or narrower; faithfully representing your range of uncertainty is critical. Don't put down a discrete number: "I don't want a number, I want a range." There are two dimensions: the width of the range, and the most likely value, including how flat or sharply peaked the distribution is around it (a PERT distribution). That is how you express your range of uncertainty. [24:09]
• Interestingly, in this profession, when you try to quantify something, precision takes a distant second to accuracy. When I give you a range that incorporates the actual outcome, my range is accurate, and you increase the probability of accuracy with wider ranges, though with diminishing returns [26:25]
• The goal is a useful degree of precision with a confidence level you can stand behind. This is the process of calibration, which Douglas Hubbard's book How to Measure Anything covers beautifully [26:44]
• Utility for decision-making vs. the estimating concept in expressing ranges: when presenting risk to decision makers, you are trying to influence buying decisions, and the calibration piece helps the decision maker make those decisions [28:59]
• Jack has written a blog series about this. Look at the ordinal scales organizations rely on: HIGH / MEDIUM / LOW. They identify their top ten risks, i.e. 10 things in the landscape they place into a high-risk bucket. But how do you differentiate the top 3 within that bucket? If people can't explain why things don't go into the bucket, they haven't thought things through with sufficient rigor. [30:25]
• Ordinal scales are not very effective; quantitative measures allow you to rank one thing above another. "I would focus on the thing I have less certainty about." The lack of certainty is itself a risk factor that needs to be dealt with [31:50]
• The profession's level of sophistication is not yet sufficient to explain to a business decision maker why money should not be spent in one area so it can be spent in another. How can someone reconcile real security with audit findings when the two are at odds? [33:46]
• A key component is applying real rigor to developing scenarios, e.g. around when encryption at rest is relevant. Encrypting your hard drive is very useful, but there are a lot of scenarios where the data can be compromised anyway and encryption doesn't reduce the risk. Define the set of scenarios where the data is at risk, identify the subset where encryption adds value and where it doesn't, then evaluate impact. Then you have a means for comparing solutions. [36:35]
• Often just laying out the scenarios is sufficient for people to realize which options are better. [37:05]
• You may find a set of control opportunities that cost a fraction as much, and show through analysis that they reduce risk more than encryption would. [37:38]
• Some IT professionals feel that engaging with a regulator or auditor implies combat; they feel they are protecting the organization. But what about educating people to reduce risk? [38:55]
• People are hesitant to go toe-to-toe with a regulator or auditor when they are operating from intuition, because they haven't applied a rigorous approach to developing their argument. Sometimes intuition is wrong and you realize the auditor is right; that's OK. But very often intuition is right. You need a framework (like FAIR) for thinking critically through complex problems, developing an argument and rationale, surfacing assumptions, and making estimates to put before the auditors. If you go to the authoritative figure without having applied any rigor, you have nothing to stand on [40:35]
• Critical thinking, the Socratic method, a logical way of thinking: it is powerful to back up intuition with a rigorous approach so you have a defensible argument [41:21]
• Simply looking at problems and potential solutions in a more rigorous, critical-thinking-driven fashion is hugely valuable. Just having a framework for discussing and debating things is hugely valuable. [42:27]
• Another component is normalizing terminology. [43:02]
• The FAIR model is really valuable here. Every organization's risk summary includes a top 10 risks list, and those lists include things like cybercriminals, social engineering, change management, mobile media, and cloud computing. Look closely: "cybercriminals" is a threat community, "cloud computing" is a technology, and "change management" is a control element. It's comparing apples and oranges; those are not loss scenarios. A FAIR Institute blog post discusses this: how organizations identify and manage their top 10 risks is a huge problem. We cannot expect the profession to mature if we can't get fundamental nomenclature correct [45:53]
• What are the easy steps someone can take to transform a top 10 list into loss scenarios? [46:21]
• Create two lists. The first is the top loss scenarios: a taxonomy, i.e. a categorization, of loss events at a balanced level of abstraction. There is a balance to be struck, and it is easy to recognize where that balance lies as you go through the process. Then, qualitatively or quantitatively, do a probability-and-impact analysis around those scenarios, and that will tell you which are the top 5 or 10. [48:02]
• The other list is control deficiencies. Most "risk assessments" are really control assessments. Prioritize by which deficiencies contribute most to risk; that identifies the top control priorities. You can't mix the two lists together. A simple way to get a handle on the risk landscape and determine focus: take the list of top 10 deficiencies and map each to the scenarios it is highly relevant or less relevant to. Then you can say "these three or four need to be hit hard because they bear on these scenarios," and over time that work will reduce or change the scenario list. [49:24]
• Recognize that you have to have two lists. A top 10 list that mixes them is worse than useless: the items can't be compared, so it's misinformation in the worst way [49:47]
• Jack recommends Measuring and Managing Information Risk: A FAIR Approach, co-authored with Dr. Jack Freund. The FAIR Institute is where to get education and plug into the ecosystem of people and organizations leveraging the framework; universities are taking part. Institute membership includes a free copy of the book, with different membership levels; the soft launch was in December, with a formal launch in February [52:10]
• The Open Group (opengroup.org, which owns the IP for Unix) has resources for FAIR and a certification for practitioners. The RiskLens blog has resources, case studies, and the book [52:22]
• RiskLens does FAIR consulting; The Open Group is the standards organization that holds the intellectual property and has embraced the FAIR Institute [53:06]
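The range-based estimation Jack describes around [24:09] (a minimum, a maximum, and a most likely value shaped by a PERT distribution, rather than a single discrete number) can be sketched in a few lines of Python. This is a minimal illustration, not RiskLens code; every frequency and loss-magnitude figure below is a hypothetical placeholder, and the frequency-times-magnitude shortcut is a simplification of a full FAIR analysis:

```python
import random

def pert_sample(low, mode, high, lam=4.0):
    """Draw one sample from a (modified) PERT distribution.

    PERT is a Beta distribution rescaled to [low, high] whose peak sits at
    `mode`; `lam` controls how flat or sharply peaked the distribution is.
    """
    alpha = 1 + lam * (mode - low) / (high - low)
    beta = 1 + lam * (high - mode) / (high - low)
    return low + random.betavariate(alpha, beta) * (high - low)

def annual_loss_exposure(n_trials=50_000):
    """Monte Carlo sketch of a FAIR-style annualized loss exposure range."""
    losses = []
    for _ in range(n_trials):
        # Loss event frequency: 5-15 events/year, most likely 8 (hypothetical,
        # echoing the "minimum 5, maximum 15" industry-data example).
        freq = pert_sample(5, 8, 15)
        # Loss magnitude per event: $10k-$500k, most likely $50k (hypothetical).
        magnitude = pert_sample(10_000, 50_000, 500_000)
        losses.append(freq * magnitude)
    losses.sort()
    # Report a range of outcomes instead of one discrete number.
    return {
        "p10": losses[int(0.10 * n_trials)],
        "p50": losses[int(0.50 * n_trials)],
        "p90": losses[int(0.90 * n_trials)],
    }
```

The output is itself a range (10th, 50th, and 90th percentiles), which is the point: a decision maker sees best case, most likely case, and worst case rather than a single false-precision figure.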
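The two-list approach from [48:02] and [49:24] (loss scenarios ranked by probability and impact, kept separate from control deficiencies mapped to the scenarios they bear on) can be illustrated with a small sketch. Every scenario name, probability, and dollar figure here is a made-up placeholder, not data from the episode:

```python
# List 1 input: loss scenarios with hypothetical annual probability and impact.
scenarios = {
    "malware outbreak on legacy systems": (0.60, 2_000_000),
    "rogue database administrator":       (0.05, 5_000_000),
    "lost unencrypted laptop":            (0.30,   250_000),
}

# List 2 input: control deficiencies mapped to the scenarios they bear on.
deficiencies = {
    "excess admin privileges": ["malware outbreak on legacy systems",
                                "rogue database administrator"],
    "patching backlog":        ["malware outbreak on legacy systems"],
    "no disk encryption":      ["lost unencrypted laptop"],
}

def expected_loss(s):
    """Probability x impact for one loss scenario."""
    prob, impact = scenarios[s]
    return prob * impact

def contribution(d):
    """Total expected loss across the scenarios a deficiency bears on."""
    return sum(expected_loss(s) for s in deficiencies[d])

# List 1: loss scenarios ranked by expected annual loss.
ranked_scenarios = sorted(scenarios, key=expected_loss, reverse=True)

# List 2: deficiencies ranked by the risk they contribute to -- never
# mixed into the same list as the scenarios themselves.
ranked_deficiencies = sorted(deficiencies, key=contribution, reverse=True)
```

Keeping the two rankings separate is the whole design point: scenarios answer "what can hurt us most," while the deficiency ranking answers "which fixes reduce that hurt fastest."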

How to Get in Touch with Jack:

Credits:

This episode is sponsored by the CIO Scoreboard

Communicate the Status of Your IT Security in 2 minutes

Other Ways To Listen to the Podcast 

iTunes | Stitcher | Libsyn | Soundcloud | RSS Feed | LinkedIn

Leave a Review

If you enjoyed this episode, then please consider leaving an iTunes review here.

Click here for instructions on how to leave a review if you’re doing this for the first time.

About Bill Murphy

Bill Murphy is a world renowned IT Security Expert dedicated to your success as an IT business leader. Follow Bill on LinkedIn and Twitter.